Andy Leonard is CSO of Linchpin People; an SSIS trainer, consultant, and developer; a Business Intelligence Markup Language (Biml) developer; a SQL Server database and data warehouse developer; a community mentor; an engineer; and a farmer. He is a co-author of SQL Server 2012 Integration Services Design Patterns. His background includes web application architecture and development, VB, and ASP. Andy loves the SQL Server Community!
I have mixed emotions about codes of conduct. I respect the right of any organization – public or private, for-profit or not – to create, maintain, and enforce codes of conduct. At the same time, I find the need for such standards depressing… especially in professional organizations.
I am and have been a member of professional organizations that have a code of conduct. I was a Microsoft MVP for five years and I am currently a member of the Professional Association for SQL Server (PASS). Both have codes of conduct (Microsoft, PASS). Both codes of conduct include an “Or Else” section. At the time of this writing, Microsoft’s section reads:
Microsoft reserves the right to remove any participant from the MVP Award Program for violating this Code of Conduct. In the event of a violation, the MVP Global Program Manager and MVP Lead aligned with the individual will review the situation. The final determination on whether to remove a person from the Program is made on a case-by-case basis. When an MVP is removed from the program, we retire the remaining benefits and his/her access to Microsoft resources.
At the time of this writing, PASS’s section reads:
If a participant violates this Anti-Harassment Policy, the conference organizers may take any action they deem appropriate, including warning the offender or expelling the offender from the conference. No refunds will be granted to attendees expelled from the Summit due to violations of this policy.
If you are being harassed, witness harassment, or have any other concerns, please contact a member of conference staff immediately. Conference staff can be identified by their “Headquarters” shirts and are trained to respond appropriately.
An Anti-Harassment Review Committee (AHRC) made up of the Executive Manager and three members of the Board of Directors designated by the President will be authorized to take action in response to an incident or behavior that violates the Anti-Harassment Policy.
For more information about incident reporting and review, please see the Anti-Harrassment Policy Process [sic]. If you have any questions about the PASS Anti-Harassment Policy, please contact PASS Governance.
If you follow the Anti-Harrassment Policy Process link at the time of this writing, the document contains the following section:
After being notified of a reported violation the AHRC [Anti-Harassment Review Committee] will make a reasonable effort to convene as soon as possible. The AHRC will make reasonable efforts to speak with all principals. No action will be taken until the AHRC has made a reasonable effort to speak with the person accused of violating the anit-harassment [sic] policy. The accused person will be offered the opportunity to review the allegations and respond to the AHRC.
I want to be clear about something: I have little tolerance for harassment. I support a person’s right to express themselves. I put on a uniform and swore to protect those rights for citizens of the US in my younger years. The uniform no longer fits, but I still rally to support the right of an individual to express themselves (even when I personally disagree with what they are expressing… but I digress…). That said, I believe my – and everyone’s – rights end when they encroach on the rights of another.
Both policies do a fair job responding to the concerns of the offended. I like that PASS’s policy involves communicating with the alleged offender. A major tenet of the US justice system is the right of the accused to face their accuser. That right has been modified in edge cases – sometimes understandably so, sometimes not – but the idea that the accused has an opportunity to respond to the charges against them is fundamental. It makes sense, too. If you were accused of something, you would want someone in authority to hear your side of the story before rendering a decision, wouldn’t you? I would.
And herein lies my concern: I see something akin to due process for the accused individual in the PASS code of conduct, but I do not see it in the Microsoft MVP code of conduct.
So my question is this: If a Microsoft MVP is accused of violating the code of conduct and the accusation is of a subjective nature (“no disrespectful behavior”, for example – not something objective like “no NDA violations”), is the accused MVP asked for their side of the story? I see that the MVP Global Program Manager and the accused MVP’s Microsoft MVP Lead review the situation. I also see that the decision on whether to remove an MVP is made on a case-by-case basis. I do not see any provision for input from the accused MVP. So… does it happen? Are they asked for their side of the story?
This isn’t my only concern.
I worry about statutes of limitations. I happen to know both of these codes of conduct were reactionary; they were written to prevent future behavior based on past incidents. How far back do they apply? They don’t say. So if it comes to light that a member of one of these groups committed an atrocious breach some years ago, what recourse – if any – does the offended have?
Conversely, I worry about ex post facto, as well. The MVP Code currently states:
7. Stay abreast of the Codes of Conduct. Microsoft reserves the right to amend or change the Code of Conduct at any time without notice. You agree to periodically review this document (http://mvp.support.microsoft.com) to ensure you are doing your part.
My concern is in regards to the timing of such changes and their application. Can Microsoft take offense at something (a blog post asking questions about their code of conduct, hypothetically), update the code of conduct, and then apply enforcement retroactively?
I worry about uniform application of codes of conduct. For example, suppose one person behaves in a way that gets them removed from a PASS Summit or kicked out of the MVP program, and another person – who happens to be better connected to leadership in PASS or Microsoft – behaves in precisely the same way but is not removed. That is a concern.
I can list more concerns but this post is long enough. My concerns are not with the existence of codes of conduct in professional organizations; they lie with upholding, applying, and enforcing said codes of conduct.
Not surprisingly, I think part of the solution is more transparency. Microsoft, PASS, and other professional organizations can take a cue from the way media reports about crimes: the names of those involved and many details of the accusation can be kept confidential. Informing the public is optional. Informing the membership is vital, I believe. Transparency works against back-room deals and shadiness due to an individual’s “connectedness” to leadership. It helps ensure uniform application of the rules to everyone. And knowing there is something akin to due process makes everyone feel better about the existence of a code of conduct.
I am honored to welcome Tim Mitchell (blog | @Tim_Mitchell) to Linchpin People!
Tim brings years of experience consulting with SQL Server, Integration Services, and Business Intelligence to our growing organization. I am overjoyed to be able to work with my friend!
Rather than babble on about Linchpin People (using words like "synergy" and "world class"), I direct you to Tim's awesome remarks on his transition, and end with a simple "w00t!"
Brian Kelley (blog | @kbriankelley) delivered an awesome and at times scary - no offense, Grant Fritchey (blog | @GFritchey) - presentation today on Security and SSIS. The recording is now available here.
Red Gate rocks.
If you didn’t know that already, you know it now. The latest evidence to support this claim is the publication of the schedule for US SQL in the City events. They are:
- 9 Oct – Pasadena
- 11 Oct – Atlanta
- 14 Oct – Charlotte
Registration for these events opens 24 May.
October 15, 2013
8:45 am - 4:45 pm
Wake Forest University Charlotte Center
200 North College Street
Charlotte, NC 28202
Business Intelligence Markup Language (Biml) automates your BI patterns and eliminates the manual repetition that consumes most of your time. Come see why BI professionals around the world think Biml is the future of data integration and BI.
Registration is just $69. Register before July 15th to receive the early-bird price of just $49. Breakfast, lunch, and refreshments are also included. Seating is limited. Register now to guarantee your spot.
Matt Masson and I are co-presenting Developing Extensions for SSIS 22 May 2013 at 11:00 AM EDT. If you’ve never heard Matt present, you are in for a treat. Matt is knowledgeable (he helped build Integration Services 2012!) and entertaining. This is going to be a good one, folks!
Join Matt Masson and Andy Leonard for a discussion and demonstrations on extending SSIS with custom tasks and data flow components.
This code-heavy session walks you through the creation of a custom SQL Server Integration Services (SSIS) task and a custom Data Flow component. If you've ever thought of customizing SSIS functionality beyond what you can do with scripting, this talk is for you! All examples will be coded in C# and made available following the session.
Let’s begin with an assertion:
“People are more important than process.” – Andy, circa 2008
Whenever an enterprise or institution adopts a new process or policy, that policy should serve people, not shackle them. The nicest thing that can be said about a policy that binds people is, “It wasn’t well thought out.” It’s a bad idea, in other words. What’s the logical thing to do when we encounter a bad idea? Reverse it, as quickly as possible. Is it a good idea to hang on to the bad idea because (hypothetically) we’ve paid business consultants lots of money, or because someone – even someone we like and respect – may lose face? No. It is not. Those are merely excuses for continuing the bad idea.
I’ve shared my thoughts on Performance-Based Management (PBM) in the past. I believe PBM is a bad idea and I request you take a few minutes to read the post at the link provided. Go ahead, I will wait right here until you’re done. Done? Good.
Performance-Based Management looks great on paper but it just doesn’t work in practice. PBM is based on faulty premises that we will address in a bit, but PBM’s defenders keep claiming “it hasn’t been implemented properly.” Does this excuse sound familiar?
“Why is PBM So Bad, Andy?”
I’m glad you asked!
PBM is an effective way to squelch innovation. It’s not the only way, mind you; but PBM is as effective as non-compete agreements in killing creativity. I wrote about non-compete agreements in The Last Percent:
They [non-compete agreements] effectively shave the last sigma or two from the bell curve of technologist thought and capability. This is precisely the domain where the 100x, 1,000x, and 1,000,000x innovations and inventions dwell.
If you read my earlier post about PBM you see an example of how it kills teamwork – which is a surefire way to limit innovation.
“Why on earth would a technology company implement such a policy, Andy?” That is an excellent question. My answer:
PBM Looks Good on Paper
On paper, PBM appears to solve a number of problems. Performance management is reduced to a bevy of statistics like lines of code written or defects closed. PBM claims to recognize an inconvenient truth: all employees are not created equal. Finally, PBM reduces the required skillset of the manager to that of a spreadsheet jockey, and spreadsheet jockeys are less expensive than experienced managers.
Managing statistics is a lot easier than managing people – especially innovative people. Innovative people are creative, and creative people are notoriously indisposed to authority in any form. Difficult to manage? Indeed, and impossible for some to manage. There are two effective ways to manage innovative individuals:
- Gain (or possess) their respect.
- Get out of their way.
A spreadsheet jockey is not going to gain the respect of an innovative person. Armed with their spreadsheet and a need for metrics collection, the spreadsheet jockey is not going to get out of the way. Because of this, PBM is a recipe for disaster for most innovative employees.
I believe the assertion that all employees are not created equal; some are better at their jobs than others. This, I hold, will always be true. But PBM focuses on one end of the less-than-awesome employee issue, putting the onus – and blame – on the employee. I asked a wildly unpopular question of management and HR at the company where a 20-60-20 PBM scheme was inflicted: “Are we hiring the wrong people or are we mismanaging the right people we hire 80% of the time?” I thought that was the right question at the time. I still do.
The problem is not that we are doing PBM wrong. The problem is that PBM is wrong. It doesn’t matter how we do it, it will never produce the desired results.
Therefore, PBM must die.
Or at least be neutered. One way to minimize the damage of PBM is to acknowledge there is but one true metric for innovative work: shipping. Creative work is art. And no artist likes to show (measure) the work until it is complete. Once the work is complete it can be measured as done. Why do I advocate a single, holistic metric? Because “shipped” accounts for nuance and intangibles that simply are not measurable. One PBM-neutering tactic is to define this single metric and be done with it. The historical metric is easy as it is a Boolean value: It either shipped or it did not ship. The current status is also simple: It is either done or in development.
“What about measuring how long the project took to ship, Andy?” I’m cool with that as long as you follow some simple rules:
- Feed this data into an evidence-based scheduling system
- Never use this information against the developer
- In fact, do not even make the developer aware of this information’s existence
- Use this information instead for the developer by adjusting any estimate according to previous evidence
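The rules above can be sketched in code. What follows is a minimal, hypothetical illustration of the evidence-based idea – scale a new estimate by the velocity ratios (actual ÷ estimate) observed on past shipped work and report percentiles, rather than trusting the raw estimate. The function name, seed, and sample history are my own inventions for illustration, not part of any specific scheduling tool:

```python
import random

def evidence_based_estimate(past_estimates, past_actuals, new_estimate,
                            trials=10_000, seed=42):
    """Monte Carlo sketch of evidence-based scheduling: scale a new
    estimate by velocity ratios (actual / estimate) sampled from the
    developer's own history, and report percentiles of the outcomes."""
    rng = random.Random(seed)  # fixed seed keeps the sketch repeatable
    ratios = [actual / estimate
              for estimate, actual in zip(past_estimates, past_actuals)]
    simulated = sorted(new_estimate * rng.choice(ratios)
                       for _ in range(trials))
    return {"p50": simulated[trials // 2],        # median outcome
            "p90": simulated[int(trials * 0.9)]}  # pessimistic bound

# Hypothetical history: estimates (days) vs. what shipping actually took.
history_estimates = [5, 8, 3, 13]
history_actuals = [7, 8, 6, 20]
print(evidence_based_estimate(history_estimates, history_actuals,
                              new_estimate=10))
```

Note that the developer never sees these numbers; the system quietly adjusts the schedule on their behalf, which is exactly the "for the developer, not against the developer" rule above.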
Reporting “shipped” metrics would require a daily email a developer could compose in less than a minute (slightly-sarcastic mini-diatribe about the cost savings of the reduced need for management redacted).
In technology, “shipped” is the only metric that matters. Lines-of-code and defects closed are game-able.
I would like to see this abuse of people in the name of process come to an end. Performance-Based Management is more than just a bad idea; it’s counter-productive, wastes time, and costs money. Elevating process above people is wrong and bad. Using statistics in this way is math poorly applied. Hence, Andy’s Law:
Statistics may be used to measure anything about people, except people.
In his article Healthcare's Big Problem With Little Data, author Dan Munro raises salient points about the state of health-related data. Electronic Health Records (EHR) were promoted as the end-all-be-all solution for the industry – a standardization that, I suppose, many thought would organically and naturally occur, stabilize, and be maintained.
It hasn’t. At least not yet.
My doctor and I speak about this almost every time I visit him. The corporation that operates his practice nowadays seems endlessly locked in cycles of changing billing and EHR systems in search of low-cost compliance and integration. They’ve (literally) spent millions of dollars; my doctor hates the interfaces forced upon him, and his patients (well, one, at least) hate the complexity of the billing and patient records systems. Can’t these systems all just get along?
The result? Higher medical data management costs. I’ll give you one guess who pays these costs.
Munro posits the following in his article:
By at least one estimate (here) there are now about 500 independent EHR vendors. Out of that large group is a subset of about 400 with at least one customer that has applied for Federal stimulus dollars through the labyrinthine process of meaningful use attestation. That would suggest a “first-cut” of about 100 vendors who made some commitment around certification – but have no reported customers (at least to date). That’s a staggering number of single-purpose software vendors for any industry to support – even bloated healthcare. The simple fact is it can’t. While there have been a few high-profile cases of EHR vendors shutting down, this last week was the first high-profile example of a vendor that was effectively decertified by the Feds for both their “ambulatory” and their “inpatient” EHR products. From the HHS.gov website last Thursday:
“We and our certification bodies take complaints and our follow-up seriously. By revoking the certification of these EHR products, we are making sure that certified electronic health record products meet the requirements to protect patients and providers,” said Dr. Mostashari. “Because EHRMagic was unable to show that their EHR products met ONC’s certification requirements, their EHRs will no longer be certified under the ONC HIT Certification Program.”
You may ask yourself, well, how did we get here? This, folks, is a mess. What’s missing? Applied standards.
“But Andy, you’ve told us standards slow down development!”
And I stand by that statement; standards do slow down development… unless you’re building interfaces. Then standards become the means for decoupled snippets, functions, methods, applications, and even platforms to communicate with each other. In some cases, we simply cannot be productive without standards – like TCP/IP. What would happen if everyone coded their own version of internet traffic? If that were the case, very few of you would be reading this post.
Yes, standards slow things down. And yes, they are necessary to ensure base functionality. In my humble opinion, we have to get this right with healthcare data. We simply must. While we see similar issues of data management across many fields, medical data is too important to mess around with; it’s (often literally) a matter of life and death – and certainly of high cost.
More to Consider
Standards exist. Administering and certifying 400-500 vendor solutions is hard.
Part of the Solution
From the actions of the Department of Health and Human Services last week, one can ascertain HHS is taking steps to address the matter. But will all 400-500 companies voluntarily congeal their schemas? Possibly, but doubtful.
My experience delivering US state Medicaid ETL solutions informs me there will be a need for data integration – regardless of the existence of standards and in spite of certification. Why? Standards are not static. The idea of de facto standards emerges from the life cycle of software because software is organic. Even if everyone agreed on the same interpretation of rigid standards (and they won’t), versions 2.0 through n.n will – at a minimum – add fields to the schema. And with additional fields comes additional data.
Standards will be revised when enough product schemas adopt the de facto, and this will drive the need for yet more integration. Don’t take my word for it, examine the entropic history of ICD-9 and ICD-10 codes – the direction of progress is more data, not less.
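To make the versioning point concrete, here is a small, hypothetical Python sketch (the field names are invented, loosely modeled on medical claims data, and not drawn from any real standard): a consumer built against version 1.0 projects incoming records onto the schema it knows, defaulting fields that are missing and setting aside additive newer-version fields for later integration work instead of failing on them.

```python
# Fields a hypothetical v1.0 consumer knows about, with defaults.
V1_FIELDS = {"member_id": None, "provider_id": None, "claim_amount": 0.0}

def map_record(raw, target_fields=V1_FIELDS):
    """Project an incoming record onto a known target schema: default
    any missing fields, and set aside additive (newer-version) fields
    instead of failing on them."""
    mapped = {field: raw.get(field, default)
              for field, default in target_fields.items()}
    extras = {field: value for field, value in raw.items()
              if field not in target_fields}
    return mapped, extras

# A hypothetical v2.0 record: the same core fields plus one additive field.
v2_record = {"member_id": "M001", "provider_id": "P042",
             "claim_amount": 125.50, "icd10_code": "E11.9"}
mapped, extras = map_record(v2_record)
```

Every field that lands in `extras` is tomorrow's integration backlog – which is the point: additive schema change guarantees more data integration work, not less.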
This is one reason we at Linchpin People are focusing on Medical Data Integration. The recording of our first (free!) webinar about Medical Data Integration with SSIS 2012 is available here. Kent Bradshaw and I continue the series tomorrow presenting Medical Data Integration with SSIS 2012, Part 2 in which we focus on loading Provider and Drug data.
I hope to see you there!
Join SQL Server MVP Tim Mitchell (blog | @Tim_Mitchell) and me as we demonstrate and discuss the many uses of scripting in SQL Server 2012 Integration Services 8 May 2013 at 11:00 AM EDT! In this demo-packed session, two co-authors of the book SSIS Design Patterns share their experience using the Script Task and Script Component to accomplish difficult transformations and improve data integration performance.
Kent Bradshaw and I are pleased to announce another free webinar: Medical Data Integration with SSIS 2012, Part 2 - Providers and Drugs. Register today!
The recording of the first part of this series is available here.
That’s right, I have two free licenses for Red Gate’s SQL Source Control for Oracle that are burning a hole in my virtual pocket!
I like Red Gate’s SQL Source Control for SQL Server a lot. At Linchpin People, we are using SQL Source Control for several projects. It. just. works. It’s so cool to be able to store the database schema in the same source control engine as the other code (SSIS or .Net for us). We even use it with the Team Foundation Service, which is free for teams of five or fewer users.
Barriers to entry are dropping like flies!
If you use Oracle and develop databases, I encourage you to participate in my little contest:
My Little Contest
To win, all you have to do is tell me about the cool stuff you’re going to do with your free SQL Source Control for Oracle license. Add a comment to this blog with your response. (Comments are moderated and will not appear right away.)
Winners will be announced in a few weeks.
Kent Bradshaw and I have been loading medical data for years. We worked together at Unisys and Molina Medicaid Solutions to build Extract, Transform, and Load (ETL) solutions for state Medicaid administrations in the US.
We decided to share some of our experience – some lessons learned, if you will. Beginning Wednesday, 10 Apr, at 11:00 AM EDT, we start a series of presentations on Medical Data Integration using SSIS 2012.
In this series, we will demonstrate:
- Loading member, provider, and claims data
- Efficient SSIS Design Patterns for medical data load
- Metadata-driven execution leveraging the SSIS 2012 Catalog
In Wednesday’s presentation we focus on loading claims data. You may register here. I hope to see you there!