Let’s talk about SharePoint and PowerShell for a minute or 30. Any SharePoint admin who has been working with the technology for more than a week or two should know that PowerShell is the secret to SharePoint automation and to making administrative duties fast and repeatable. My rule of thumb for those I mentor or talk to about SharePoint automation is “if you have to get the same data from SharePoint more than twice, automate it and keep the script.”
Let me diverge for a moment. Of all the technologies Microsoft has brought to market, other than the Xbox, my favorite would have to be Microsoft OneNote. If you haven’t tried it out yet, you are missing out. When I was in the army, we learned that you always had a notebook with you to capture anything you needed: orders from the brass, grid coordinates or other information. Microsoft has taken this concept and made it digital. It is a traditional notebook format that can go anywhere you can. You can create one or multiple notebooks, each with one or multiple tabs and many different pages per tab. This allows for an almost infinite number of possible configurations and ways to store information. It includes a print driver that allows you to print TO OneNote. All those pages you want from websites can be printed right into a notebook. Screen captures can go there too. Blog posts…they get a tab of their own and individual pages for each post. Add to all this SkyDrive integration and you can get to your notebook anywhere you have an internet connection. If you have an iPhone or iPad, there is a free app for that. Windows Phone integration? Of course. I have stopped taking a laptop or notebook to meetings and just tap out notes and TODOs on the phone. But enough about OneNote…
As we talk about automation, I want to give an example. I recently started as the new SharePoint Architect for an e-government provider. As such, it is my job to understand where we have SharePoint implementations and how they are architected and configured. I was hired because SharePoint grows like kudzu and had started to overwhelm the staff. Once I had my feet firmly planted on the ground here, my boss gave me the task of inventorying all the SharePoint farms we currently support. First stop, DNS. Because of the forethought of a naming convention, 90-95% of our SharePoint servers were named with MOSS in the computer name. Let’s not get into the fact that the term MOSS has been deprecated. That’s a rant post for later.
So I now had a list of most of the servers I would be dealing with. Next step was access. I talked to the AD guys, and they were kind enough to grant me access to most of the boxes via group policy and the other boxes through direct contact. I quickly learned that about 15% of the servers were no longer active and had been shut down but not pulled from DNS. My list was growing smaller. At this point it became a task of logging into the Central Administration server in each farm and getting the basic farm information: servers in the farm, OS, RAM, SharePoint version (2007 or 2010), SP/CU levels, services (web, app, both) on each and any other information I thought prudent. This was a lot of work, but it got me to know the overall environment and figure out what was running where, as well as eliminate any dead boxes.
By this point, you are asking yourself why I started out talking about PowerShell, right? I am getting there. Having basic information isn’t enough. I now had 8-9 farms with anywhere from 2-12 servers per farm. Some farms had as many as 186 content databases and a ton of host header site collections. No way did I want to gather all this by hand and have to do it regularly to see if anything had changed. I looked around for some basic scripts.
My plan was to create a site in SharePoint with several lists. The 4 basic lists were servers, web apps, databases and URLs, with an additional list called portals to tie them together. Because each state we manage is called a portal internally, it made sense to tie them together with a “portal” lookup list. I already had the server information and quickly hand-jammed that in (data-sheet view is amazing and might be the subject of a quick tip in the future). But what information did I need in the other lists?
The web apps list consists of the name of the web application, the URL, the version (2007 or 2010), the portal (in my case the state) and the environment (production, authoring, staging, test, development). The databases list consists of database name, current site count, status (online or offline), site collection warning level, maximum allowed site collections, the web application name (linked to web apps) and the portal. The URL list is pretty much just the URL, the web app (linked again) and the portal (for filtering purposes). All 3 of these lists were easily automated into CSV files, opened in Excel and then cut/pasted into the data-sheet view of the appropriate list.
#Get all the web applications in the farm
$webapps = Get-SPWebApplication

#Declare the files we will write to
$appFile = "AppInfo.csv"
$dbFile = "DBInfo.csv"
$urlFile = "URLInfo.csv"

#Declare variables
$portal = "Corp"
$env = "Production"
$version = "2010"

#Create the header row for the CSV files
Add-Content -Path $appFile -Value "WebApplication,URL,Version,Portal,Environment"
Add-Content -Path $dbFile -Value "Name,CurrentSiteCount,Status,WarningSiteCount,MaximumSiteCount,WebApplication,Portal"
Add-Content -Path $urlFile -Value "URL,WebApplication,Portal"

#Loop through each web app
foreach($webapp in $webapps)
{
    #Add information about the web application
    Add-Content -Path $appFile -Value "$($webapp.Name),$($webapp.Url),$($version),$($portal),$($env)"

    #Loop through all the content databases in this web application
    foreach($db in $webapp.ContentDatabases)
    {
        #Add information about each database
        Add-Content -Path $dbFile -Value "$($db.Name),$($db.CurrentSiteCount),$($db.Status),$($db.WarningSiteCount),$($db.MaximumSiteCount),$($db.WebApplication.Name),$($portal)"
    }

    #Loop through all site collections in the web application
    foreach($site in $webapp.Sites)
    {
        Add-Content -Path $urlFile -Value "$($site.Url),$($webapp.Name),$($portal)"
    }
}
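One weakness of building CSV rows by hand is that a single missed comma, or a value that happens to contain a comma, corrupts the file. A variant worth considering, sketched here under the same assumptions (a SharePoint 2010 farm server with the Microsoft.SharePoint.PowerShell snap-in loaded), collects objects first and lets Export-Csv handle the delimiters and quoting:

```powershell
#Sketch, not the original script: build objects, let Export-Csv do the quoting
$portal = "Corp"

$dbInfo = foreach ($webapp in Get-SPWebApplication) {
    foreach ($db in $webapp.ContentDatabases) {
        #Property names become the CSV header row automatically
        New-Object PSObject -Property @{
            Name             = $db.Name
            CurrentSiteCount = $db.CurrentSiteCount
            Status           = $db.Status
            WarningSiteCount = $db.WarningSiteCount
            MaximumSiteCount = $db.MaximumSiteCount
            WebApplication   = $webapp.Name
            Portal           = $portal
        }
    }
}

#-NoTypeInformation suppresses the "#TYPE" line PowerShell adds by default
$dbInfo | Export-Csv -Path "DBInfo.csv" -NoTypeInformation
```

Note that New-Object with -Property works on the PowerShell 2.0 that ships with SharePoint 2010, but it does not preserve property order, so expect the columns in a different order than listed.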
This will generate 3 CSV files that allow for easy cut/paste operations. The only problem we run into is that this doesn’t work natively in SharePoint 2007. Because PowerShell support there can be very spotty, we will need to come up with something else. I will be working on that in the near future, but luckily I have very few 2007 farms left in the environment, and those that I do have are smaller and easier to “hand-jam” into the lists.
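For the 2007 farms, one possible stopgap, sketched here as an untested outline rather than a finished script, is to skip cmdlets entirely and load the server object model from PowerShell on a farm server. The 2007 object model exposes the same properties the 2010 script reads:

```powershell
#Sketch for a SharePoint 2007 farm: no SharePoint cmdlets exist,
#so load the server object model and walk it by hand (run on a farm server)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

#ContentService exposes the content web applications in the local farm
$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService

foreach ($webapp in $contentService.WebApplications) {
    foreach ($db in $webapp.ContentDatabases) {
        #Same data as the 2010 script; write it to CSV the same way
        Write-Host "$($webapp.Name),$($db.Name),$($db.CurrentSiteCount)"
    }
}
```

From there, the same Add-Content pattern as above would produce matching CSV files, so both versions of the farm feed the same SharePoint lists.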
Hopefully I have been able to give you the basics needed to gather information you might need from SharePoint on a regular basis and have shown you how it might be used to make your life easier. If you have any questions or have variations on this script that add useful data, feel free to comment and let me know. I am always looking for ways to increase the Xbox time and decrease the time spent gathering data.
