SQLblog.com - The SQL Server blog spot on the web

Argenis Fernandez

DBA Best Practices - A Blog Series: Episode 3 - Use a Wiki for Documentation

This blog has moved! You can find this content at the following new location:

https://www.0xsql.com/2013/02/13/dba-best-practices-a-blog-series-episode-3-use-a-wiki-for-documentation/

Published Wednesday, February 13, 2013 10:31 AM by Argenis

Comments

 

merrillaldrich said:

Excellent! We are implementing the SharePoint wiki for this, and it's a fantastic way to handle ops-type documentation. SharePoint lists in the mix also have great uses (issues log, anyone?) :-). SharePoint obviously has its warts, but it's already present in many places, and this was a natural fit.

February 13, 2013 1:06 PM
 

AlexK said:

What happens when something changes and renders your wiki page incorrect?

Documentation gets out of sync with reality all too easily, you know...

February 13, 2013 1:22 PM
 

Jason said:

Thanks Argenis!  I've got a set of PowerShell scripts that I run to dump information into Excel files: things like database file locations and database configuration settings, instance-level information such as configuration settings and logins, and even sheets that drill down to OS-level details like memory, IPs, and the number of processors and cores.

Right now, I manually run these once or twice a month and upload them to a site that the DBA team has access to.  I'll start thinking about using our SharePoint system more fully.

My scripts - http://www.the-fays.net/blog/?p=142

February 13, 2013 1:54 PM
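The kind of inventory dump Jason describes can be sketched roughly as follows. This is a hypothetical Python analogue, not his actual scripts (which are PowerShell and write to Excel); it gathers a few OS-level facts from the standard library and writes them to CSV as a stand-in for an Excel sheet. The field names and output filename are illustrative.

```python
import csv
import os
import platform
import socket
from datetime import date

def collect_host_info():
    """Gather basic OS-level facts about this host (a minimal,
    illustrative subset of what a real inventory script collects)."""
    return {
        "hostname": socket.gethostname(),
        "os": f"{platform.system()} {platform.release()}",
        "processors": os.cpu_count(),
        "collected": date.today().isoformat(),
    }

def dump_inventory(path, rows):
    """Write one row per server to a CSV file; CSV stands in here
    for the Excel output described in the comment above."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)

# In a real deployment this would loop over a server list; here we
# just inventory the local machine.
dump_inventory("inventory.csv", [collect_host_info()])
```

A scheduled task running something like this weekly, with the output uploaded to the team wiki or SharePoint site, is one way to keep the "documentation drifts from reality" problem (raised by AlexK above) in check.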
 

Argenis said:

@Jason that's hilarious, somebody on Twitter was telling me about his own effort to do the same thing, except it saves the information to XML rather than Excel. Perhaps we should get a CodePlex project going for this stuff... there's also the work that Chad Miller has done.

February 13, 2013 2:00 PM
 

Argenis said:

@AlexK - I'm not sure I follow your scenario. If you're careful with your templates and your doc. scripts, you can certainly minimize the likelihood of these "out of sync" issues arising.

February 13, 2013 2:04 PM
 

Brent Ozar said:

Two other important considerations with this kind of project:

One, you want it to be available when your datacenter is having major problems.  At one company, we had a wiki like this, but the database back end was stored on the SAN.  When we had a SAN outage, we desperately needed to know all of the affected systems, but ... you can see where that's going.

Two, you don't want it to be available to outsiders.  I'm not saying there's a security risk if someone gets a detailed list of IPs, OSs, hardware, etc - good security guys should be able to create that list from scratch quickly anyway - but amongst executives, there's at least the thought that this list is a security risk.  Setting aside the actual risk, the perceived risk is high enough that you probably don't want to just dump it in an insecure cloud or DMZ system in an overreaction to the first problem.

February 14, 2013 8:51 AM
 

Argenis said:

@Brent - great points, thanks for commenting.

February 14, 2013 10:52 AM
 

Ty said:

TiddlyWiki is a single-HTML-file wiki... we recently used it to document a SQL application with great success.

February 16, 2013 10:37 AM
 

Hugo Mendes said:

Great article! A wiki is a great tool for storing documentation. We currently use Confluence, which is easy to use and was adopted well within the team. Even colleagues who are not fond of documentation quickly saw how easy it was to update and how good the search utility is (it searches within uploaded content too).

One downside (to add to Brent's comment) is that when we perform our yearly off-site DR test, this wiki server has to be built and recovered first (using printed-out DR build documents) before we can recover any of our other servers.

On another note, we've also implemented a script (VBScript/WMI) that runs weekly, grabs all production server information, and writes it to individual per-server text files, which are then imported into an infrastructure database. It's really handy having these [servername_date_config.txt] files on hand.

Scripting really is the only way to ensure that your infrastructure database is up to date and correct. Humans are really good at forgetting to update server lists, and it only takes one row of incorrect data to put the whole database's integrity at stake, prompting your manager to assign a "please check and update our entire server inventory table" task to your name.

February 18, 2013 5:19 AM
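The weekly snapshot files Hugo describes could be sketched like this. This is a hypothetical Python stand-in (the original is VBScript/WMI), and the collected keys are illustrative, not the real WMI property set; only the [servername_date_config.txt] naming pattern comes from the comment above.

```python
import json
import platform
import socket
from datetime import date
from pathlib import Path

def snapshot_config(out_dir="."):
    """Write a [servername_date_config.txt] snapshot file, following
    the naming pattern described above. The keys collected here are
    a minimal illustrative subset, not the original WMI fields."""
    config = {
        "server": socket.gethostname(),
        "os": f"{platform.system()} {platform.release()}",
        "collected": date.today().isoformat(),
    }
    name = f"{config['server']}_{config['collected']}_config.txt"
    path = Path(out_dir) / name
    path.write_text(json.dumps(config, indent=2))
    return path

# Each weekly run produces a dated file, so the directory becomes a
# simple history that a loader can import into an inventory database.
print(snapshot_config())
```

Because the date is embedded in the filename, successive runs never overwrite each other, which is what makes the files useful as a point-in-time record during DR exercises.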
 

