
Unannounced Automatic Updates … Are They Really Better For Users?
Another monthly "Patch Tuesday" has just passed and, like many folks, I updated all my systems. I also awoke on Wednesday and Thursday to machines that had rebooted themselves after Windows patched them. In both cases I was upset, as I had open documents that I had not saved. To smooth this process over, many software packages are moving to background updating with no notice to users. I question whether this is really better for users.
Everyone knows about software updates thanks to Microsoft and its Windows/Microsoft Update system. Starting out as the "Windows Update" website in the late 1990s, updates were delivered over the internet. Later versions of Windows integrated more tightly with the update service, leading to the current incarnation of Windows Update built into Windows 7 and the soon-to-be-released Windows 8. Microsoft gives the user many levels of customization around the update process. One option, set as the recommended standard by Microsoft, is to install critical updates and reboot after installation. This has caused issues for many users whose computers were rebooted without warning, and users have complained about losing data.
This has caused Microsoft to provide deep customization options around its updates to help prevent data loss.
Windows 8 changes this again. Having gone through a few months of patches on my Windows 8 installations, both Customer Preview and Release Preview, I prefer the new updater. Windows 8 performs the monthly patching through the Windows/Microsoft Update process as before, and users can still customize the experience, but the reboot is the key difference: Windows 8 notifies the user that they should reboot within the next three days before the reboot happens automatically. Finally, Microsoft is on the right path! The only thing better Microsoft could do is figure out how to apply updates without requiring reboots at all. As the Windows NT core becomes more and more modular, this should get easier: only updates to core elements would require a reboot, while subsystems could simply be restarted with the new code.
Now, take a look at how Adobe, Mozilla, and Google are doing their updates. All three have changed how they update their main products: Flash for Adobe, Firefox for Mozilla, and Chrome for Google. Their most current versions, as well as earlier versions of Chrome, are now set up to download and install updates automatically. With the default settings, all of them do this without notifying the user that anything has changed; the only way to find the current version is to look at the package's "About" page or screen. I have not yet heard of issues with this process, but a major concern is what happens when a bad release goes out. Users would be confused as to why their computer wasn't working. A good example of this was Cisco's firmware update for the Linksys EA2700, EA3500, and EA4500 routers in late June. The update forced users off the local administration interface and onto a cloud-based system, and there were concerns about that cloud service and what information it tracked. With no other way to manage their routers, users were given no choice, all caused by an automatic update. Cisco has since reversed course, but the episode has hurt its standing with users: many are unhappy, and some have even returned their units.
As a manager of IT services, this is my biggest concern, and it makes me unwilling to support products that update automatically in the background. Within a managed environment, unannounced changes cause many problems. Microsoft built its monthly patch cycle around the needs of enterprise environments; it is truly designed for IT management systems. Updates are announced when they are delivered, which lets IT teams review them and assess the risk to their organization. It also allows for testing cycles and deployment through systems managed by the IT teams. The new unannounced automated updates allow for none of this.
With this movement toward unannounced automated changes, some in the tech world see it as the best thing for users. One argument is that it is good for developers because products keep improving, comparing it to how web applications can be upgraded without user intervention. This is a poor comparison: a web application can be fully tested against the "standards" it targets, while applications installed on a user's computer are a harder problem. Did the software publisher test every configuration? That is much easier on controlled platforms like Apple's iOS and Mac OS X; on Microsoft's Windows platform and on Linux-based operating systems, it cannot be done easily. In a way, the fact that Microsoft, working with hardware providers, can make Windows run on so many different configurations is absolutely amazing. I suspect that Adobe, Mozilla, and Google do not do this sort of in-depth testing.
I can see automatic unannounced updates being a positive thing for consumer users, but personally I do not like them at all. I have configured Adobe Flash to notify me of updates instead of just installing them. When I need Firefox, I use a version that does not update automatically, and I have mostly stayed on IE for my personal use. To my dismay, Microsoft is now going to start performing automatic updates the way Chrome and Firefox do. My hope is that it offers a management system so IT teams can control the process. Having worked at Microsoft, I wonder what the internal IT teams there think of this automatic update process.
Further automating the update process will keep more users up to date and improve the overall security of the internet; Microsoft showed this with the move to the monthly patch process. Currently, statistics from security vendors like Kaspersky Lab show a major shift among malware writers from attacking Windows directly to using other software as the attack vector, the most popular being Adobe Flash and Oracle/Sun Java. This lets malware authors infect more than just Windows: Apple Macs and mobile devices running iOS and Google Android as well. The response to these threats has been automated updates for those attack vectors. This helps users and increases security on the internet, but Microsoft has shown that a standard cadence can also work. Adobe did try a standard cadence for its product updates but has not been able to keep to it, given the severity of the security issues being patched of late. Instead of trying to make it work, Adobe is moving to the model popularized by Google and, later, Mozilla.
The downside to all of this is the platform for the upgrades. Every product seems to need its own mechanism for monitoring for and applying new updates. Google and Mozilla both now install their own updater service that runs on the computer all the time, with administrative privileges; that is the only way for a service to install code without user intervention. My IT "spidey senses" go on high alert any time I hear this. Right now, many home computers are most likely running 5-10 updater services of some sort. One solution is for the operating system to provide a standard mechanism for this sort of updating. Another is to use the operating system's task scheduler to run the update checks, as sketched below. One great opportunity is the CoApp project headed up by Garrett Serack (@fearthecowboy) with many contributors; it could become a single updater that all packages use for their updates. Some sort of standardized, single point for updates would make users' systems run cleaner and happier.
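As a rough illustration of that second option, here is a minimal sketch, assuming a Windows 8 / Server 2012 machine with the ScheduledTasks module, of registering a daily update check through the OS scheduler instead of a resident service. The application name and updater path are hypothetical placeholders.

# Minimal sketch: let the OS Task Scheduler run a vendor's update check once a day
# instead of a resident updater service. "ExampleApp" and UpdateCheck.exe are
# hypothetical placeholders for a vendor-supplied check-only updater.
$action  = New-ScheduledTaskAction -Execute 'C:\Program Files\ExampleApp\UpdateCheck.exe'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'ExampleApp Update Check' -Action $action -Trigger $trigger `
    -Description 'Daily check for ExampleApp updates'

Whether that check goes on to install anything then becomes a visible policy decision the user or IT team can control in one place, rather than something buried in yet another always-running service.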
The issue of unpatched systems on the internet is a major one for all of the computing world, but especially for IT teams and their configuration management. In my review of ITIL/ITSM management philosophies, configuration management is the most critical aspect: controlling change is how an IT team keeps a company running. It is also the one area most IT teams do not do well, and it shows. If the push is toward unannounced automatic updates for web browsers while more companies use web tools to run their businesses, how will they verify that all of those tools keep working with each update? Will they see more Helpdesk calls from users confused when sites and tools break? What do you think?
Jared
Office 2013 and the Art of the Announcement
Microsoft took its show on the road to California again, this time to San Francisco instead of Los Angeles as for the Surface announcement. Steve Ballmer got up in front of the media to talk about how the company's strategy is coming together and, specifically, to announce the public beta of Office 2013. While I will talk a bit about the technology, in this post I mostly want to talk about how Microsoft set up both the Surface and Office 2013 announcements, as it seems to want to take a page from Apple's great showmanship.
First, let's talk about the product. Microsoft Office is the leading productivity package used by people around the world. Starting with just Word back in 1983, Microsoft expanded the line with the purchase of Forethought in 1987, adding PowerPoint to its productivity software selection. The first complete package of Office as we know it today was released in 1995 as Office 95. It was followed closely by an upgrade in 1997, aptly called Office 97 and code-named "Ren and Stimpy". The year-of-release naming carried through later versions such as Office 2000, 2003, 2007, and 2010; the only exception was Office XP, released in 2001 to coincide with the launch of Windows XP.[1] What we see from Microsoft today is a three-year release cycle for Office, something enterprise users can build a cadence around in either three- or six-year cycles.
With Office 2013, Microsoft is trying to simplify the user interface, ribbon included. Changes include moving toward the "Metro"-styled ribbon by capitalizing tab names, making UI elements flatter, and simplifying the look and feel. The biggest irony for me was installing Office Professional Plus on my EP121 tablet running the Windows 8 Release Preview, which until now had only OneNote 2010 and Lync 2010 installed. I use OneNote heavily, and I love the new interface of the desktop version on Windows 8. Hearing in the announcement that OneNote and Lync had Windows 8 Metro experiences (aka MX) available, I went looking for them but did not see them anywhere. Thanks to Mary Jo Foley of ZDNet and All About Microsoft, I learned these two apps will be available in the Windows Store shortly. And even bigger thanks to fellow Krewe member Aubrey, who pointed out that I could install the Office 365 preview of Office 2013 and get them now. Tempting for sure, but I will stay with this install and wait for the versions in the store. More information on the Office 2013 release is available from WinSuperSite and All About Microsoft.
Now come my complaints about the Apple-like showmanship of the Surface and Office announcements. First, I am an IT manager and I have a day job; I blog because I like to write, but it is not my job. My heart goes out to the journalists who cover this beat, and I do not claim to know their work, though having had drinks with a few and gotten to know them, I have some understanding of what they go through. Having said that, and having seen Paul Thurrott and Mary Jo Foley at MS TechEd in Orlando in June and watched Mary Jo Foley's coverage of the MS Worldwide Partner Conference in Toronto in July, I have to ask: why did Microsoft hold separate events for these announcements? Much of the media that covers Microsoft was already at both. Why make them set up last-minute travel to venues that aren't even standard spots for Microsoft?
Let's do a quick comparison between Microsoft and Apple. Microsoft has a large development conference called Build, and it announced Windows 8 there. Apple has its Worldwide Developers Conference in San Francisco each June and announces its new iOS or even Mac OS releases there. Microsoft has its TechEd conference in June as well, a perfect time to make announcements about technologies like Exchange, SharePoint, and Lync, but it held those back for an Office announcement. Okay, I can understand wanting to keep the message together. But then the next obvious place to announce is WPC. Who better to announce with as your "rabid audience" than partners looking to sell these solutions? But no, Microsoft does not take advantage of this home turf. Instead, it holds two separate events with little preparation, forcing the tech media to jump if they want to cover them. This sounds very much like Apple and how it makes its announcements throughout the year, away from WWDC.
I like that Microsoft is jumping out and getting into the spotlight. I like that they are being mysterious and are capable of it; the Microsoft I worked at was so full of holes that information leaked out like a sieve. The Surface announcement was a great notion but left me with so many questions. Why have it in LA and not Seattle? Why make an announcement of that type at a last-minute event instead of using something like TechEd? Since the media did not get hands-on time anyway, why not show it off at TechEd? Then the media would not have been the only ones in the room. Surface would have gotten a huge standing ovation at TechEd, with possible Seattle Sounders-like chanting in the keynote. Instead, many a TechEd fan and attendee was left crushed, watching this important announcement while Microsoft's most important user base got no chance to look at its next step.
With the Office announcement, Microsoft did it again. It had a chance to announce something big at its major annual partner event but passed again. It had a chance to announce on its home turf and make the tech press come to it, but passed on that too, going to them in the Bay Area instead. What does this say? It smacks of letting the tech press declare that the center of the tech world is San Francisco and Silicon Valley: if you want to release something, you must come to us. Given the size of both of these announcements and the baseline of support, Microsoft should keep them on its home turf or in front of its home crowd. That is the real lesson from Apple here: Apple makes its announcements in front of its developers. Use your home crowd for your announcements, Microsoft, and don't be scared to invite the media to your events.
What do you think of Office 2013? What do you think of Microsoft's announcements and style? Should Microsoft have kept these as separate events, or folded the announcements into its existing ones? Leave a comment below.
Jared
Footnote:
1 – Office history courtesy of http://www.intowindows.com/microsoft-office-history-in-brief/
The Future of Managing Microsoft Products - PowerShell
One key message I took from both of the Microsoft TechEds I have attended was about how Microsoft plans for its products to be managed: PowerShell. Some might say the "grey beards" have infiltrated Redmond and put a CLI management system into the products. With the planned release of PowerShell 3.0 in Windows Server 2012, PowerShell becomes a key skill for IT professionals to learn; those who do not might be left behind.
I remember building Visual Basic scripts (VBScript) over the years to run scheduled jobs and perform maintenance tasks, on Windows 2000 Server and Server 2003. I also remember looking at Linux and its Bash environment for scripting and managing systems. I loved the idea of a scripting language that was easy to execute remotely and manage systems from, that had broad access to the system without add-ons, and that could be run live at the command line. VBScript did not offer this to me on Windows. Yes, I could build, buy, or find COM objects to give me access to other parts of the system from VBScript, but I kept looking enviously at Bash.
Fast forward to 2006 and the tooting of the PowerShell horns. PowerShell v1 was released for Windows XP, Server 2003, and Windows Vista. I played with it initially, but it never caught on for me; I could see its future and hoped for more. With PowerShell v2.0, released with Windows 7 and Windows Server 2008 R2, I started to be impressed. I dabbled more and more with it, but it was still just dabbling: I was confused by much of it and did not invest the time I should have. (Dates verified thanks to the Wikipedia article on Windows PowerShell.)
Now, on the cusp of the Windows 8 and Server 2012 releases, I am re-energized about PowerShell. In reviewing Microsoft's technologies and researching this piece, I learned one key thing: Microsoft is betting heavily on PowerShell. Many of its products' management utilities are now just a GUI on top of PowerShell scripting. Key examples are Exchange Server 2007 and 2010; Lync Server 2010; System Center 2007, 2010, and 2012 (including Virtual Machine Manager); and SharePoint 2010. All of these consoles execute PowerShell behind the GUI, and while the console covers most day-to-day management, it does not expose everything the PowerShell cmdlets can do. An IT professional working at the command line can do more than the management console allows.
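As a concrete taste of that gap, here is a hedged example from the Exchange 2010 Management Shell: a bulk mailbox change that would take many clicks in the console but is a single pipeline at the prompt. The database name and quota value are made up for illustration.

# Exchange 2010 Management Shell: cap the send quota on every mailbox in a
# hypothetical "Sales" database. Doing this mailbox-by-mailbox in the console
# would be tedious; at the command line it is one pipeline.
Get-Mailbox -Database 'Sales' -ResultSize Unlimited |
    Set-Mailbox -ProhibitSendQuota 2GB -UseDatabaseQuotaDefaults $false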
With the advances in PowerShell v3, I am looking into how to learn more. There are a few good resources online, such as:
TechNet - http://technet.microsoft.com/en-us/library/bb978526.aspx
PowerShell.com - http://powershell.com/
PowerShell Blog - http://blogs.msdn.com/b/powershell/
Scripting Guy - http://technet.microsoft.com/en-us/scriptcenter/bb410849.aspx
CodePlex - http://codeplex.com
Another way to learn is to attend conferences like TechEd or other educational opportunities. One trainer to look for is Don Jones, a multi-year MVP in PowerShell, whose sessions tend to fill quickly. Don has also written a book on the basics called "Learn Windows PowerShell in a Month of Lunches". My intent is to get this book and spend my time learning PowerShell, as I feel all IT professionals should. I am also going to encourage my staff at work to do the same.
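For anyone starting from scratch, the discovery cmdlets are the place to begin; every "learn PowerShell" resource above covers them, and they work the same in v2 and v3.

# The three discovery cmdlets most PowerShell training starts with.
Get-Command -Noun Service        # find the cmdlets that work with Windows services
Get-Help Get-Service -Examples   # read the built-in examples for one of them
Get-Service | Get-Member         # inspect the properties and methods of the objects returned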
What do you think of PowerShell? Are you going to spend time learning it?
Jared
Impact of Windows Server 2012 Licensing
Thanks to Mary Jo Foley of ZDNet/CNET, we have finally heard about Microsoft's licensing plans for Windows Server 2012 (code-named "Windows Server 8"), to be released this fall. In her article here, she covers the crux of the licensing announcement made by Microsoft, but I want to look at it in a bit more depth.
When you review Mary Jo Foley's article, you can start to see some of Microsoft's next plays and whom it is going after with this pricing model and these offerings. As she says in her article:
The four SKUs are Foundation (available to OEMs only); Essentials; Standard and Datacenter. The Essentials SKU is for small/mid-size businesses and is limited to 25 users. The Standard and Datacenter SKUs round out the line-up. The former Windows Server Enterprise SKU is gone from the set of offered options.
Microsoft is removing a few SKUs. These include the Enterprise SKU, which sat between Standard and Datacenter in the 2008/2008 R2 licensing model; the HPC (High Performance Computing) SKU, meant for people doing large-scale computing and modeling such as scientists and researchers; and the Small Business Server SKU, a bundle meant for small companies. Of all of these, I think the biggest loss for most customers is Small Business Server.
Small businesses do not have large capital to drop on big IT systems. They find what fits their budget and tend to fall back on lower-cost or free software to fill in the gaps. When I have worked with small business owners in the past, many had their teenage kids "build them a server" and install a Linux variant on it. The kid goes off to school, leaving the business stuck with a server that cannot be updated for either features or security. In my experience, about 30-40% of my consulting calls were this exact scenario.
To remedy this, I would help them find a server that cost more but gave them more bang for the buck. In many cases, it was a commodity server of some sort running Microsoft Small Business Server. It gave the owner something familiar in Windows, plus more advanced offerings like Exchange and SQL Server: the ability to run their own messaging and calendaring server in Exchange and a higher-end database server in SQL Server. They could buy software that required one or the other and gain a competitive advantage over others who did not have these options. All in all, Small Business Server was one of the better ideas Microsoft came up with.
With the v2 release of Windows Home Server, Microsoft also released a Small Business Server variant related to Home Server. This was a continuation of Small Business Server with the Windows Home Server GUI placed on top. It offered easy Active Directory creation, integration with Office 365, and a Premium add-on that gave the business Exchange and SQL on-premises rather than only in the cloud. When I saw this offering, I was thrilled for small business owners: this could have been the "Small Business Server appliance" operating system that steamrolled the market. After its release, though, all I heard was crickets chirping and deafening silence; the product never got off the ground.
Fast forward to Windows Server 2012, and there is no Small Business Server SKU in today's announcements. To replicate this offering, a business will need to license either the Essentials or Standard edition, depending on whether it has more than 25 users. Then it will have to either license Exchange 2010/2013 (when released) or get Exchange through Office 365. For SQL services, the business could use the free Express edition of SQL Server, with its connection and feature limits, or purchase a larger SQL Server edition. (For more information on SQL Server 2012 licensing, check out the "Features Supported by the Editions of SQL Server 2012" page on MSDN.)
This is much more expensive than the Small Business Server model and will cause many small businesses to go back and rethink their IT strategy. In this one stroke, Microsoft may re-open the door to free packages like Linux and MySQL, or push businesses into using desktop operating systems as servers. Someone in Redmond needs to look hard at this and remember that small business is a large market for them. Don't just hand it over to the competition.
Moving My VM's from 2008 R2 to 2012 RC
If you read my prior post about my new Hyper-V rig, you know I wanted to get some of my current systems running on it so I could see the performance differences. I also wanted to get those VMs off my 2008 R2 Hyper-V host so I could possibly upgrade it to 2012 RC as well and do some clustering between the two. While there are many ways to do this, I did my moves the completely manual way. Kids, do not try this at home …
With my brand-spanking-new rig, I wanted to get some VMs running on it beyond the couple I had spun up to test disk I/O and memory allocation; I wanted some real load on the box. To do that, I needed to move some of my machines from my 2008 R2 Hyper-V host and convert them to run on 2012. There are many ways to accomplish this with many tools, but I chose the most laborious route of all to make sure the systems converted just fine.
Since all of the servers needed their monthly patches, the first thing I did was patch them. Yes, much like many IT folks, my servers were behind on their patches. Like the proverbial cobbler's kids and carpenter's house, IT people do not always practice what we preach. One thing I hope to set up with the additional hardware and space is a patching system to do this for me. Also, since I was patching the VMs on my 2008 R2 host with its poor disk I/O, the process was very long, as I had to patch each machine individually.
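For what it's worth, even a small lab can script the "what still needs patching?" question. A rough sketch, assuming PowerShell remoting is enabled on the guests and using made-up server names, would query the Windows Update Agent COM API on each VM:

# Sketch: ask each guest (hypothetical names) how many updates it still needs,
# using the Windows Update Agent COM API over PowerShell remoting.
$servers = 'WEB01', 'WEB02', 'EXCH01'
foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        $session  = New-Object -ComObject 'Microsoft.Update.Session'
        $searcher = $session.CreateUpdateSearcher()
        $pending  = $searcher.Search('IsInstalled=0 and IsHidden=0').Updates
        "$env:COMPUTERNAME has $($pending.Count) update(s) pending"
    }
}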
As each server finished patching, instead of rebooting it fully, I simply shut it down. I then copied the VHD file from the 2008 R2 host over to the 2012 host and converted the virtual hard disk to VHDX. For my web servers, which connect to a file server for configuration and website content, I turned the VM back on on the 2008 R2 host while the conversion completed.
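On the 2012 host, this copy-and-convert step can be done from PowerShell with the Hyper-V module's Convert-VHD cmdlet; a sketch with placeholder share and file names standing in for my environment:

# Run on the Server 2012 host (Hyper-V PowerShell module). Paths and share names
# are placeholders.
Copy-Item '\\HV2008R2\VMs\WEB01.vhd' 'D:\Hyper-V\Virtual Hard Disks\WEB01.vhd'
Convert-VHD -Path 'D:\Hyper-V\Virtual Hard Disks\WEB01.vhd' `
            -DestinationPath 'D:\Hyper-V\Virtual Hard Disks\WEB01.vhdx'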
As I have two web servers, the copy took about 5 minutes and the conversion about 20 minutes for each one. By turning the web servers back on on the 2008 R2 host, I could work on the conversions and setup on the 2012 server while my sites kept running. Once everything was set on the new VMs hosted on 2012, I had the NIC settings dialog open with the new IP address entered, ready to hit "OK", and shut down the 2008 R2 host's VM. Downtime was about 10 seconds on one server and about a minute on the other; the only reason it was longer on the second server was my fat-fingering the gateway address. With that, the web servers were moved to my 2012 server.
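The IP cutover itself can also be scripted on the guest rather than clicked through the NIC dialog; a sketch using netsh (which also works on older guests), with the adapter name and the 192.0.2.x addresses as placeholders:

# Sketch of the cutover on the guest: set the static address the moment the old VM goes down.
# Adapter name and 192.0.2.x addresses are placeholders.
netsh interface ipv4 set address name="Local Area Connection" static 192.0.2.10 255.255.255.0 192.0.2.1
Test-Connection 192.0.2.1 -Count 2   # confirm the gateway answers before calling it done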
The next server I chose to move was my Exchange 2010 server. In my lab I run a single-server installation (I am now waiting for all my friends to give me hell for that) and wanted to move it over to my new 2012 system as well. As the only mail going through the server is my own, taking it offline was not an issue. After it completed its updates, I moved the VHD over to the new 2012 host and started the conversion, without restarting the old VM on the 2008 R2 host. Once the conversion was done, I created the new 2012 VM, attached the freshly converted VHDX, and brought it online.
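Creating the replacement VM and attaching the converted disk is also just a couple of cmdlets on the 2012 host; a sketch with made-up names, memory size, and switch name:

# Server 2012 host: create the new VM around the converted VHDX and start it.
# Name, memory size, and virtual switch name are placeholders.
New-VM -Name 'EXCH01' -MemoryStartupBytes 8GB `
       -VHDPath 'D:\Hyper-V\Virtual Hard Disks\EXCH01.vhdx' -SwitchName 'External'
Start-VM -Name 'EXCH01'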
With the new Hyper-V Integration Services and drivers installed, all of the moved VMs were brought up to 2012 hosting standards. I am now running only two VMs on my 2008 R2 host: my SQL 2008 server and my Windows Home Server 2011 system. In reviewing my process, I could easily do the same steps for the WHS 2011 image, except that its back-end iSCSI storage is provided by my DroboPro and I need to figure that part out again. I am also not happy with my current WHS configuration: it no longer offers me any advantage as a single device, since I don't use its remote access capability or its media streaming. It has ended up as a file server, and I would prefer to run a real file server on Windows Server 2012 instead. As for the SQL 2008 server, I plan to build a new one on the 2012 host, mirror the databases across, and then use the 2012-hosted VM as my primary.
Many might say I am taking a big risk, since there is no guarantee of an upgrade path from 2012 RC to 2012 RTM. I have also done something most would call dumb in moving the VHDs manually. But I felt the risk/reward of building new VMs from the VHDs off the 2008 R2 host beat installing tooling that would have taken a long time to set up. If Microsoft allowed the 2012 Hyper-V tools to manage 2008 R2 servers, I might have used some of those other tools, but that does not look like it will happen. I now have three of my main servers over on the new host, and they are looking pretty damn good.
Next, I will be working on both the SQL server and the WHS/file server moves. I will post more about that work when I get to it.