
The Rambling DBA: Jonathan Kehayias

The random ramblings and rantings of a frazzled SQL Server DBA

How do you know if information online is accurate?

UPDATE: Based on feedback, the author has made corrections to remove the incorrect information from the article; the corrected version should be published at some point.

Today an article on SQL Server Central titled SQL Server Memory Configuration, Determining MemToLeave Settings was published that has incorrect information and fails to adequately cover the subject.  I've left a comment on the article, but the subject of MemToLeave in SQL Server is too vast to cover in the comments on an article, and those comments are unlikely to be read by someone who has a problem with VAS in their SQL Server instance.  I'll be posting an article or blog post to address that topic in more detail shortly.
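In the meantime, if you suspect VAS pressure on a 32-bit instance, a rough first look (just a sketch, not a substitute for a proper write-up on the subject) is to see which memory clerks are reserving virtual address space outside the buffer pool:

    -- Which memory clerks hold reserved/committed virtual address space
    -- (sys.dm_os_memory_clerks, SQL Server 2005 and later).
    SELECT type,
           SUM(virtual_memory_reserved_kb)  AS vas_reserved_kb,
           SUM(virtual_memory_committed_kb) AS vas_committed_kb
    FROM sys.dm_os_memory_clerks
    GROUP BY type
    HAVING SUM(virtual_memory_reserved_kb) > 0
    ORDER BY vas_reserved_kb DESC;

On a 32-bit instance, large reservations eat into the limited virtual address space even if the memory is never committed, so clerks with big reservation numbers are worth a closer look.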

The problem of knowing what information online is accurate is becoming bigger every day.  It isn't hard to get published online these days: anyone can start their own blog on one of the major blog sites, or even register their own domain and use a product like WordPress to have a site up and running in under a day.  Not all blogs are bad, and any blogger is bound to make an error at some point.  So how do you know if a blog is a good source for the information you found on it?  For me personally, a lot of it has to do with who wrote the blog post, but if you are new to SQL Server, that won't be very useful.  From that standpoint, I'd do a search for the person's name online and see what else they have written.  If you are looking at Adam Machanic's blog and want to know if his information on splitting strings in SQLCLR is any good, a quick search on Google or Bing will give you an idea of who Adam is.  In addition to this, I'd look at the comments on the blog, not necessarily on a specific post but across the entire blog.  If you find a lot of unanswered comments, or negative comments, you might question the content more.  It's really hit or miss with blog posts sometimes, so use caution with these.

Anyone can also submit articles to sites like SQL Server Central, SQL Team, Simple Talk, SearchSQLServer, and a long list of other sites.  Add to the mix the numerous forums and groups that are also on these sites, and you have a mix of information that makes it difficult to know what is right and wrong.  To someone new to SQL Server, or to the online communities, this is nothing less than the perfect storm of information, and what I find is that the sites you might expect to have the most accurate and up-to-date information have articles that fall short, like the one today.  This discussion is not a new one; Aaron Alton semi-covered this problem, though from a different angle, in his blog post A Publisher's Responsibility.  I am pretty sure that I was involved in every one of those Twitter discussions on the problem of bad articles being published on big sites.

It is easy to point fingers at who to blame for the problem of bad article content on big sites, but I don't know that it really is the answer.  I know Steve Jones from SQL Server Central, and I don't envy the job he has managing the content that gets published there.  It would take a team of SQL Server experts to tech edit every article that gets published, and that isn't feasible, financially or time-wise, for most of the sites listed, though I haven't personally worked with all of them to publish content.  In most cases an article with content the community would question may not raise any flags during the publishing process unless it is grossly incorrect.  The editors of these sites say they don't have the wherewithal to be complete experts on SQL and run the sites as well; instead they rely on the community to more or less police itself with ratings and comments.  The disconnect here is the expectation that people actually read the comments on an article before trying what the article suggests; when they don't, the comments fail to fix the problem, or the problem gets worse.

Online forum communities are a bit different from sites that publish articles.  The primary difference is that these communities really do police themselves to a greater degree.  The other difference is that the responses to a question can be read in the chronology of the forum thread.  I don't know anyone who reads a question and a possible solution and then immediately jumps to trying it out; people tend to read the entire thread to see everything that was said over its history.  I tend to put a little bit of faith in what I find in online forums and newsgroups when I am searching for a problem.  That's not to say that bad answers don't occur on forums, but when they do, they are more likely to be corrected by a subsequent post, and you see that correction as part of the thread.

In addition to this, forums have rating/ranking systems in place that provide information about how active a member is and, in some cases, how often their responses are the answer to the question being asked.  This can help in determining the person's level of credibility, but it can also be misleading.  I reply to SQL Server topics all the time on MSDN, and I have an MVP tag associated with my Live ID.  If I reply in a Visual C# Express forum, my profile looks the same, so you might think I know what I am talking about, but I would be outside my area of expertise.  Use forum member rankings carefully when determining whether content is correct or not.

The best source of online content is likely the print magazine sites, where the content is the digital version of the printed publication: TechNet, MSDN Magazine, SQLMag, Windows IT Pro, and others.  These publishers are generally more picky about the content they allow simply because their users pay for it.  This means that you are usually reading material written by more established names in the industry like Paul Randal, Adam Machanic, Andrew Kelly, Kalen Delaney, Kevin Kline, or Itzik Ben-Gan.  Following right behind these in credibility would be whitepapers, and then digital copies of books (purchased legally; don't go horse trading online copies of books, because authors put a lot of time into writing them).  Each of these forms of media goes through a complex process of reviews to ensure that the information in them is both technically and grammatically correct.

So to recap, the level of credibility I give information based on where I find it, in descending order, would be:

 

  • Books
  • Whitepapers
  • Online copies of Print Magazines
  • Forums/Newsgroups
  • Online Articles
  • Blogs

Now if you know the author of the online article or blog post, then the information contained there should be considered credible, at least to the extent that anything else they have written would be.  In reality, unless you have something completely obscure going on, once you think you've found a solution you should be able to search for it and find it repeated in another source somewhere.  There are still obscure things you can search for and not find much information on; in that case, you might try the contact method available on a blog or article to get further information from the author.  The main takeaway from this is that you need to be careful about what information you follow when you get it online.

Published Monday, July 06, 2009 8:53 PM by Jonathan Kehayias

Comments

 

Mike Walsh said:

Interesting points. I would even add that when reading a book, caution should still win out before just trying something in production. I blogged about it (if you can trust my blog ;-) ) when I wrote about empirical evidence here: http://www.straightpathsql.com/blog/2009/1/18/empirical-evidence.html

Test, Test, Test. Make sure you understand what you are reading and double-check your sources. I don't want to see sites like SSC go through the pains of a rigorous tech editing process. The comments are great because (as you did today) the community ends up responding. Can't force folks to check comments first. Can't even force folks to test first, but at some point there needs to be some onus/personal responsibility put on those who are just trying something without verification and testing.

I would say that if they fail to do so then maybe being a DBA isn't the right career path for them.

July 6, 2009 10:16 PM
 

Chris Wood said:

Jonathan,

Please get your blog on MemToLeave out ASAP, as I am fighting a MemToLeave production problem that keeps throwing 701 messages on 32-bit SQL 2005 SP2. I am learning more and more about MemToLeave and how CLR and linked servers use this special memory.

Thanks

Chris

July 6, 2009 10:23 PM
 

Jonathan Kehayias said:

Mike,

I'd have to agree, but unfortunately that isn't the way the world works.  I never thought that someone would try to create a numbers table using the max value of an integer when I used that in an example I wrote, but someone did it and filled his SQL Express database to 4GB in size, which caused his application to fail.  It was all my fault, though, at least according to the person who did it.

July 6, 2009 10:55 PM
 

Jonathan Kehayias said:

Chris,

Shoot me a message, either through the contact link on here or by DM on Twitter (@SQLSarg) if you are on there, and I will try to help.

July 6, 2009 10:56 PM
 

Chris Wood said:

Jonathan,

I have read your VAS blog and it follows most of what I had seen before (I had bookmarked Christian's blog entry soon after it came out). My problem has been made worse by starting to use Red Gate SQL Backup, which I already knew from previous experience uses contiguous VAS. I have an open Premium Support case with MS on this problem. The -g startup parameter is currently set at 396, and it looks like linked servers may be our big problem. We had recently tried to switch from the MS provider to an Oracle provider running 'out of process', but this is failing in testing. MS have provided a debug tool to collect stats once I have permission to get it running.

Our server is running W2K3 R2 SP2 Enterprise Edition with 8GB of RAM and SQL 2005 SP2 build 3282 Enterprise Edition, with AWE enabled, min server memory set to 2GB, and max server memory set to 4GB. There are some Access DBs on the same server. I have little chance of upgrading, as next year this server will be replaced with one running W2K8 and SQL 2008.

Thanks to MS, I am running some queries on a regular basis that show a number of repeated pieces of application code getting stored in the procedure cache; forced parameterization could possibly help with that, but only after lots of testing. When we get our 701 problems, it seems that a good portion of the procedure cache becomes unusable; on other occasions SQL will recover the memory for later use.
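The kind of check I mean looks roughly like this (just the general shape of it, not the exact script MS supplied):

    -- Rough shape of the check: the largest ad hoc plans sitting in the
    -- procedure cache, which is where the repeated application code shows up.
    SELECT TOP (20)
           cp.objtype,
           cp.usecounts,
           cp.size_in_bytes / 1024 AS size_kb,
           st.text
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    WHERE cp.objtype = 'Adhoc'
    ORDER BY cp.size_in_bytes DESC;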

Thanks for listening. I saw you at PASS last November.

July 7, 2009 11:02 AM
 

Grumpy Old DBA said:

You're absolutely correct, and I've been known to "have a go" at articles published in respected publications for being incorrect. We're all prone to getting it wrong from time to time (myself included); even MS themselves sometimes get it wrong. It becomes worse on forums, with people posting to gain status or MVP status; they often post without any knowledge or experience of the subject, and that's how so many myths and so much wrong information come about. I too read the article about VAS and thought it was a good effort to cover a complex subject, although my own feeling is that the sooner we leave x32 behind, the sooner some of these issues diminish. Just as a footnote, I've just been to a meeting with a storage vendor who is recommending RAID 5 for transaction logs because there is no RAID 5 overhead on sequential writes - please feel free to correct me if I've been doing it wrong for 18 years <grin>

July 8, 2009 7:39 AM
