
Tuesday, April 10, 2012

Things I could have done for you this week.

I was thinking about finding a job and about what to write for a new post, and I thought this might be interesting!
Click here to view or download my resume!

  1. I could have written you 3 to 20 SSRS reports, or set up some SSRS deployment scripts and report execution monitoring. I could have helped you get your dynamic drop-down lists working, or worked on your report template and set up a report footer page to display the selected filters, etc.
  2. Worked with a few of your clients and built them small 5 to 10 table applications exactly as they wanted, not just what they would have asked for. I would build the application in front of their eyes as they explained what they wanted, so there would be no possibility that they didn't get what they asked for. I would even determine the real data unique record constraints and make sure they were implemented! MS Access 2010 rocks, especially with SharePoint and ADO! I'm working on getting a few MS Access apps set up for download. The web interface still isn't great, so they run much better opened in Office 2010. I'm still a bit leery of giving out links to my SharePoint site but will have some set up soon. My DBATools on SQL Azure is going to take a few weeks.
  3. Set up a job to email you a report of every table in every database on every server that is missing a clustered index (see the heap query sketch after this list). These tables will not be defragmented by DBCC index defragmentation, but more importantly, clustered indexes determine your read and write performance.
  4. Set up a job to email you a report of every table in every database on every server that is missing a primary key.
  5. Set up a job to email you a report of every table in every database on every server that is missing real data unique record constraints.
  6. Set up a job to email you a report of every table in every database on every server that is missing referential integrity constraints.
  7. Set up a lot of jobs for your servers, like ones that use VBS and WMI to defragment the hard disks, update your fragmentation report data, or use DOS calls to run the cleanup utility. It's not just the jobs; it's the code too! Do you have a library of functions and procedures? If not, I just gave you a ton of code.
  8. Set up audit logging on every table in every database on every server with a primary key consisting of 5 or fewer concatenated columns (a trigger sketch follows this list). With a little more work I could have modified the code to work on larger concatenated keys or, my preference, a single unique surrogate key. I could also have set up the jobs to update, enable, and disable the audit log triggers; they will need to be updated after every deployment involving table modifications. The audit log data will be moved to an audit log database.
  9. Set up a data import routine. If you would rather spend weeks on every new data set / file you want to import, feel free! Evidently money grows on trees! BCP works pretty well, though! (A BULK INSERT sketch follows this list.)
  10. Scanned your existing code and documented existing data issues that could save (or cost) you billions of dollars. Laugh if you want, but the truth is scary!
  11. Configured my trace jobs to collect data for the Database Engine Tuning Advisor and configured the jobs to run the tuning advisor.
  12. Set up a job to collect population and selectivity data for every column in every table in every database on every server and to send you an Excel spreadsheet of the data (a one-column sketch of the query follows this list). If the data you're reporting on means anything, it will be fully populated.
  13. Set up an anomaly detection database to start analyzing your databases for bad data, such as disposition dates without acquisition dates or disposition dates earlier than the acquisition date (an example query follows this list).
  14. Set up routines to use WinZip or 7-Zip to zip all the files in a directory, delete the originals, and move the zip to the SQL Server backup location or wherever you wanted.
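
For item 3, here is a minimal sketch of the kind of per-database query such a job could run. It lists heaps (tables with no clustered index) from the catalog views; a job would loop it over every database and email the combined result. This is an illustration of the idea, not the exact script I use.

    -- Sketch: heaps (tables without a clustered index) in the current database.
    -- A scheduled job could run this per database and email the combined result.
    SELECT
        DB_NAME()  AS database_name,
        s.name     AS schema_name,
        t.name     AS table_name,
        p.rows     AS row_count
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    JOIN sys.indexes AS i ON i.object_id = t.object_id
                         AND i.index_id  = 0          -- index_id 0 means heap
    JOIN sys.partitions AS p ON p.object_id = t.object_id
                            AND p.index_id  = i.index_id
    ORDER BY p.rows DESC;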
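
For item 8, the audit logging works by generating one trigger per table from the primary key metadata. This is a hand-written sketch of what one generated trigger and its staging table could look like; dbo.Customer and its CustomerID key are made up for illustration, and the real procedure builds the equivalent dynamically and later moves the staged rows to the audit log database.

    -- Staging table: the first of the two stages; a job later moves these
    -- rows to the audit log database.
    CREATE TABLE dbo.AuditLogStage
    (
        AuditID    int IDENTITY(1,1) NOT NULL PRIMARY KEY,
        TableName  sysname        NOT NULL,
        KeyValues  nvarchar(450)  NOT NULL,
        Operation  char(1)        NOT NULL,              -- U = update, D = delete
        AuditDate  datetime       NOT NULL DEFAULT GETDATE(),
        LoginName  sysname        NOT NULL DEFAULT SUSER_SNAME()
    );
    GO

    -- One generated trigger (illustrative table and key names).
    CREATE TRIGGER trg_Customer_Audit
    ON dbo.Customer
    AFTER UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Log the key of every row that was changed or removed.
        INSERT INTO dbo.AuditLogStage (TableName, KeyValues, Operation)
        SELECT 'dbo.Customer',
               CAST(d.CustomerID AS nvarchar(450)),
               CASE WHEN EXISTS (SELECT 1 FROM inserted) THEN 'U' ELSE 'D' END
        FROM deleted AS d;
    END;
    GO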
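
For item 9, bcp itself is a command-line utility, but the same kind of load can be sketched in T-SQL with BULK INSERT. The target table, file path, and delimiters below are placeholders for whatever the real feed uses.

    -- Sketch: load a delimited file into a staging table.
    BULK INSERT dbo.ImportStaging
    FROM 'C:\Imports\new_data.csv'
    WITH
    (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2,      -- skip the header row
        TABLOCK
    );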
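
For item 12, the job would presumably generate a statement like the one below for every column of every table with dynamic SQL and collect the results. Here is the idea for a single column, with an invented table and column name.

    -- Population and selectivity for one column (illustrative names).
    SELECT
        COUNT(*)                  AS total_rows,
        COUNT(LastName)           AS populated_rows,    -- non-NULL values
        COUNT(DISTINCT LastName)  AS distinct_values,
        CAST(COUNT(DISTINCT LastName) AS float)
            / NULLIF(COUNT(*), 0) AS selectivity
    FROM dbo.Customer;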
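
For item 13, the anomaly checks are ordinary queries over whatever business rules apply. Using the disposition/acquisition example from the list, with a made-up asset table:

    -- Sketch: rows that violate the date rules named above.
    -- dbo.Asset and its columns are illustrative.
    SELECT AssetID, AcquisitionDate, DispositionDate
    FROM dbo.Asset
    WHERE (DispositionDate IS NOT NULL AND AcquisitionDate IS NULL)
       OR (DispositionDate < AcquisitionDate);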

Thursday, June 9, 2011

About me


I have been working in the IT field since 1992 and have specialized as a database professional working with Microsoft SQL Server since 1995. I have also worked with Visual Basic and MS Access since 1992 and with Crystal Reports since 1995. I hold a bachelor’s degree in Computer Science from CSUS with a concentration in systems programming, so I've done my share of programming in numerous languages and am a prolific coder.

I have extensive experience troubleshooting Microsoft SQL Server performance problems and developing Microsoft SQL Server views, user-defined functions, stored procedures, and Data Transformation Services (DTS) packages for ETL processes. I have developed and worked on thousands of SSRS and Crystal Reports and worked with many MS Access applications, using it as a standalone database or as a front end for client/server databases. Additional skills include: process analysis; process automation; data warehouse / reporting database design, development, modification, population, and optimization; and data conversion, reconciliation, consolidation, and cleanup.

While completing my last 3 years at CSUS, I worked as a PC technician at the California State Controller's Office, Systems Development Division. Upon completion of my degree, I started working as a contractor for Kaiser Permanente. During this time I worked directly with the physicians in 5 different departments, writing database applications and redesigning existing applications. The applications were written in Paradox, MS Access, FileMaker Pro, and Visual Basic. After I had been contracting with them for some time, there was a change in the company's development strategy and most new application development was moved to Oakland. Around this time I was hired as a temporary employee and placed on a team developing and maintaining an OB/GYN PowerBuilder application.

In October of 1994, I left Kaiser for a lead programmer/analyst position at Access Health, Inc. While at Access Health, I led efforts to convert 200+ flat-file-based reports from IQ for Windows to Crystal Reports/Oracle and developed an MS Access/Visual Basic based report scheduler used to schedule and send over 2,000 reports a month to the printers, complete with cover pages that included recipient lists, WHERE clause criteria, requester information, etc.

In November of 1995, after surviving a 40 percent layoff at Access Health, I returned to Kaiser Permanente to take a full-time permanent Programmer Analyst position. In May 1997, my prior supervisor from Access Health contacted me. He had been hired as CIO of a company that purchased, serviced, and sold loans and was planning to transfer its IT and data services departments from Baton Rouge to Sacramento. I accepted his offer and joined as an analytical and technical lead to assist in the relocation. The first few months were spent working out of my home as I hired the first few employees. After a few months we moved to a new location, and I began flying back and forth between the Baton Rouge and Sacramento offices to document and assess their current business processes and applications. During this period, I built an SQL Server based data warehouse/reporting environment and ETL procedures to consolidate data from an AS/400, multiple SQL Server databases, SAP, and MS Access applications. I also developed, maintained, and upgraded a number of database and intranet applications and worked closely with the help desk to resolve application and data related issues. In addition to these duties, I took on the role of lead reporting developer. During these years, I developed and delivered monthly reports for 10 securitizations involving contracts worth hundreds of millions of dollars.

In March of 2003, after taking some time off, I started consulting independently. Since then I have worked on both long-term and short-term contracts, including 15 months at the California Cancer Registry and, most recently, 3 years and 8 months at the California Public Employees' Retirement System (CalPERS).

Here is a checklist of some of the things I like to do with an SQL Server database.

  1. Check to see that there are natural key unique constraints and primary keys on all tables to eliminate the possibility of duplicate records (a minimal example follows this list).
  2. Review existing stored procedures and replace cursor-based operations with set-based operations.
  3. Make sure there are no semi-random or repetitive updates. One way to check for semi-random updates is to run the population procedure twice, populating two copies of the same table from the same source data (see the comparison queries after this list). If there are any discrepancies between the resulting tables, it is likely that there are semi-random updates in the population code.
  4. Make sure stored procedures are designed with flexible input parameters to eliminate the need for multiple procedures that perform the same basic operations. This minimizes maintenance and debugging time.
  5. For reporting stored procedures, standardize the input parameters and include a dynamic date range ID.
  6. Set up fully automated server-side tracing and execution of the index wizard to analyze and automatically apply, or just email, suggested index additions or deletions.
  7. Ensure there are statistics on all tables. Under certain circumstances the SQL Server 2000 sp_createstats procedure will die without an error message and not create statistics on all tables. These missing statistics need to be identified and created to ensure efficient SQL Server execution plans (a query sketch follows this list).
  8. Set up audit logging on all tables. I do this with a stored procedure that dynamically creates and applies audit log triggers on all tables and uses 2 stages to populate the final audit log table, so the performance hit of full audit logging is minimized.
  9. To encapsulate business logic and reduce maintenance, create user-defined functions and/or tables to perform repeatable operations. This usually involves some trade-off between performance and maintenance, so it must be balanced according to the needs of the company and the amount of data being processed.
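
As a minimal illustration of the first checklist item, a surrogate primary key plus a unique constraint on the natural key is enough to make duplicate business records impossible. The table and columns here are invented for the example.

    -- Surrogate primary key for joins, unique constraint on the natural key
    -- so duplicate business records are rejected. Illustrative names only.
    CREATE TABLE dbo.Employee
    (
        EmployeeID  int IDENTITY(1,1) NOT NULL
            CONSTRAINT PK_Employee PRIMARY KEY,
        SSN         char(9)  NOT NULL,
        HireDate    datetime NOT NULL,
        CONSTRAINT UQ_Employee_SSN UNIQUE (SSN)
    );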
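
For the semi-random update check (item 3), after populating two copies of the same table from the same source, a pair of EXCEPT queries is all it takes to surface the discrepancies; both should come back empty if the population code is deterministic. The table names are placeholders.

    -- Rows produced by run A that run B did not produce...
    SELECT * FROM dbo.Target_RunA
    EXCEPT
    SELECT * FROM dbo.Target_RunB;

    -- ...and the reverse direction. Both result sets should be empty.
    SELECT * FROM dbo.Target_RunB
    EXCEPT
    SELECT * FROM dbo.Target_RunA;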
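
For the missing statistics check (item 7), on newer versions the catalog views make the gaps easy to find. This sketch lists columns in the current database that no statistics object covers; on SQL Server 2000 the equivalent check would go against the system tables instead. Not every uncovered column necessarily needs statistics, but the list is a starting point.

    -- Columns not covered by any statistics object in the current database.
    SELECT s.name AS schema_name,
           t.name AS table_name,
           c.name AS column_name
    FROM sys.tables  AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    JOIN sys.columns AS c ON c.object_id = t.object_id
    WHERE NOT EXISTS
    (
        SELECT 1
        FROM sys.stats_columns AS sc
        WHERE sc.object_id = c.object_id
          AND sc.column_id = c.column_id
    )
    ORDER BY s.name, t.name, c.name;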

Here is a link to my LinkedIn account, my Facebook page, and another to my YouTube channel. The YouTube channel is mostly Jiu-Jitsu, but there is also a playlist with a few hours of my ad-hoc guitar improvisations.