Archive for the ‘Microsoft Deployment Toolkit’ Category
A customer earlier in the week had implemented ConfigMgr for their builds and was getting good results with it. They hadn’t implemented MDT as they couldn’t see the benefit, so with this series of posts I’m going to highlight why we usually take the integrated approach, and what benefits using MDT Task Sequences brings.
Now that SP2 for ConfigMgr is in Release Candidate (and due for RTM at the end of October according to Mr Niehaus) we can use this stuff for Windows 7 deployment.
First up, install SQL, ConfigMgr, its dependencies, and MDT 2010 RTM.
Now, integrate MDT with ConfigMgr by running the Configure ConfigMgr Integration wizard:
Now open the ConfigMgr console. Nothing much has changed, but you have a couple of new options when you right-click in the Operating System Deployment node: you can create MDT boot media by right-clicking Boot Images, and you can create an MDT Task Sequence by right-clicking Task Sequences. Let’s do that now!
When we do this we are prompted to pick a template. So, here’s the first benefit with MDT. More pre-configured templates:
The standard ConfigMgr task sequence only gives three options:
Further to that, the ConfigMgr standard Task Sequence expects you to have set up all of the dependent packages, boot images, etc. yourself before running through the wizard. MDT will create them for you if required…
Ok, so we pick the MDT Client Task Sequence template. I can already tell that this is going to be a rambling post, but the first thing of note is that you no longer have to provide a capture destination if you’re not going to be doing a capture. Hurray, a minor irritant squashed (it’s the little things…)!
That said, I need to capture this build, so I fill in the box.
With the standard ConfigMgr task sequence I’d need to select one of the pre-built boot images, but I want the goodness of ADO support and the other brilliant extras, so I get MDT to make a special one just for me:
MDT asks if you want other languages and a custom wallpaper (I always make mine in PowerPoint; some of the templates make for pretty wallpapers. With all this technology around, there’s still no getting away from the fact that customers like their logos on things, and why not?). On the same screen as the wallpaper, language and ADO options you can also provide an extra directory to add to the boot image. I put my diagnostic tools in here (Trace32.exe etc.); they make life easier if you have problems in PE.
I create a Deployment Toolkit File Package. This holds all the scripts and bits that the MDT task sequence needs. Those of you still with us may notice that I put everything in a sub-folder of a root folder called OSImaging. This keeps things nice and tidy as far as I’m concerned, and is something I recommend.
Now MDT wants to create our OS package for us. Again under the standard ConfigMgr task sequence you’d have to do this outside of the wizard.
It also creates the ConfigMgr client package for you. Again, it’s not hard to do yourself, but why bother when the wizard can sort you out…
and USMT package:
We don’t need Sysprep, so we can skip the final screen, and then we’re ready. The wizard goes off and creates all the objects listed above.
Once it’s finished we just need to add the packages created to distribution points (this includes the OS install, so it can take a little while). I’ve got a PXE Service Point, so I add my new boot image to that DP too.
Next we’ll deploy and capture this and then start to look at the clever stuff we can do with the MDT integration to streamline deployment and support advanced deployment scenarios.
The MDT Package Mapping approach is a great bit of OS deployment technology and can help deliver an excellent success rate on large-scale zero touch deployment projects. One limitation that currently exists is that package mapping only works in the Refresh Computer scenario, i.e. I’m moving from XP to Vista on the same machine, so when I install my Vista image, put back the applications which I used to have.
For the Replace Computer scenario, where we’re deploying new hardware, out of the box, package mapping does nothing for us.
In the ConfigMgr Replace Computer scenario we use ConfigMgr Computer Associations to recover the user state (via a State Migration Point) from the OLDCOMPUTER to the NEWCOMPUTER during the build process. We run the Replace Computer task sequence on the OLDCOMPUTER to capture the user state, then the NEWCOMPUTER’s State Restore phase magically recovers it. All good stuff, but it doesn’t help us with the apps…
But it can. A small change to the PackageMapping stored procedure can use the same Computer Association record to migrate the applications across machines in the same way as we migrate the user state. It’s a shame that this isn’t integrated in the ConfigMgr console as elegantly as the Computer Associations are, but it works…
First we need to import the NEWCOMPUTER into ConfigMgr using the Import Computer Information wizard. Select the OLDCOMPUTER as the Source for the NEWCOMPUTER (obviously this can be done on a large-scale by populating a CSV file with these entries).
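Behind the scenes, the Computer Association surfaces in the site database through the v_StateMigration view, which is what lets us translate the NEWCOMPUTER’s MAC address into the OLDCOMPUTER’s. A quick way to sanity-check that the association is in place is to query the view directly; this is only a sketch, with the SMS_JON database name and the MAC address as placeholders for your own values:

```sql
-- Sketch: given a NEWCOMPUTER MAC address, return the OLDCOMPUTER MAC
-- recorded by the Computer Association. Replace SMS_JON with your
-- site database name and the MAC with the machine you just imported.
SELECT sourceMACAddresses
FROM SMS_JON.dbo.v_StateMigration
WHERE restoreMACAddresses = '00:11:22:33:44:55'
```

If this returns no rows, the Computer Association hasn’t been created and the modified package mapping lookup will come back empty.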
Next we need to modify the PackageMapping process so that it runs against the MACAddress of the OLDCOMPUTER rather than the NEWCOMPUTER.
In SQL Management Studio, browse to the PackageMapping stored procedure and select to modify it. Replace the entry shown with this text (replace SMS_JON with the name of your SMS database):
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
ALTER PROCEDURE [dbo].[RetrievePackages]
    @MacAddress CHAR(17)
AS
SET NOCOUNT ON
/* Select and return all the appropriate records based on OLDCOMPUTER inventory */
SELECT * FROM PackageMapping
WHERE ARPName IN
(
    SELECT ProdID0
    FROM SMS_JON.dbo.v_GS_ADD_REMOVE_PROGRAMS a,
         SMS_JON.dbo.v_GS_NETWORK_ADAPTER n
    WHERE a.ResourceID = n.ResourceID
      AND MACAddress0 = (SELECT sourceMACAddresses
                         FROM SMS_JON.dbo.v_StateMigration
                         WHERE restoreMACAddresses = @MacAddress)
      AND n.ResourceID IN (SELECT ResourceID FROM SMS_JON.dbo.v_R_System_Valid)
)
A little explanation of what is happening here: rather than querying the Add/Remove Programs inventory for the machine being built, the sub-query uses the v_StateMigration view to translate the NEWCOMPUTER’s MAC address (@MacAddress) into the OLDCOMPUTER’s MAC address, so the package list returned is the one inventoried on the machine being replaced.
With a thorough PackageMapping database, a thorough User State Migration profile and good planning in assigning OLDCOMPUTERs and NEWCOMPUTERs through Computer Associations, we can achieve a near-seamless and very high performance desktop replacement approach.
The MDT Package Mapping functionality (Scenario 5: Dynamic Computer-Specific Application Installation, to give it its MDT name) provides for the automatic re-installation of ConfigMgr applications during OS re-imaging. It does this by interrogating the ADD_REMOVE_PROGRAMS SQL view. The deployment database contains a PackageMapping table, which is used to pair an Add/Remove Programs (ARP) inventory entry with a ConfigMgr package and program in this way:
The PackageMapping table supplied with the deployment database does not contain the Comment and ID columns shown above, but I’m finding these pretty useful. Comments are handy as the database gets larger, and the ID field is used as a primary key on the table so that we can update the database from a Microsoft Access front end.
Populating the database with new mapping entries is somewhat error-prone… The ARP entries are often GUIDs, and the Packages entry requires the package ID and exact program name; any of these entered incorrectly results in a failed build, which can be awkward to troubleshoot. The only minor complexity is that the ID column should be configured as IsIdentity (AutoNumber) in SQL:
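Pulling the above together, the extended table looks something like this. This is a hedged sketch only: the column types and lengths are assumptions, with the column names taken from the queries and description above rather than from the MDT-supplied schema:

```sql
-- Sketch of the PackageMapping table with the extra Comment and ID columns.
-- Types/lengths are assumptions; ARPName and Packages are the columns the
-- MDT-supplied table provides.
CREATE TABLE dbo.PackageMapping
(
    ID       INT IDENTITY(1,1) PRIMARY KEY, -- IsIdentity (AutoNumber), for the Access front end
    ARPName  NVARCHAR(255) NOT NULL,        -- Add/Remove Programs entry (often a GUID)
    Packages NVARCHAR(255) NOT NULL,        -- ConfigMgr package ID and exact program name
    Comment  NVARCHAR(255) NULL             -- free text; handy as the database grows
)
```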
To reduce the risk of incorrect entries being added, I have knocked up a quick MS Access form to allow selection of the ARP name and install package entry. To build this, all we need are a couple of ODBC entries: one, "PackageMapping", connected to the DEPLOYMENT database, and another, "ARPData", connected to the ConfigMgr database.
Now we just need some linked tables in Access.
The PackageMapping table is from the DEPLOYMENT database.
The others are SQL views from ConfigMgr. The field highlighted in red builds the MappingEntry field into the required format (the package ID and exact program name).
This structure allows us to select the PackageID from a list of all Packages, but displays the friendly package name.
The MDT Package Mapping functionality is an excellent way of providing application migration during zero touch OS deployment. Coupled with a decent USMT configuration you can get some really good results to provide a seamless OS refresh.
We’re using Package Mapping extensively on a current Zero-Touch project and have come across some minor issues related to obsolete resource records.
For example, machine PC001 has ten applications which are in-scope for package mapping (i.e. the Add/Remove Program (ARP) entries they have are mapped to ConfigMgr packages in the MDT PackageMapping table). PC001 has a problem and is rebuilt using the legacy Ghost approach by IS support. Of the ten applications which were installed previously, only three are actually required by the user, so only these three are replaced.
A couple of months go by… We’re finally in a position to deploy our new ZTI image to PC001. The install goes fine, but ten apps are re-installed to the machine when the image is installed. Checking the machine’s ARP record in Resource Explorer beforehand revealed only the three in-scope apps, so what’s happened?
Pretty straightforward stuff: the stored procedure used to populate the PackageMapping entries (PACKAGESxxx) in the image installation task sequence looks up the ConfigMgr ResourceID using the host machine’s MAC address; this ResourceID is then used to interrogate the ARP entries table for all apps present at the last hardware inventory cycle. The problem is that when the query for the ResourceID is executed, it gets two results: the one from before the machine was re-Ghosted, and the new record created when the machine re-joined the infrastructure. The old record will ultimately be aged out of the database (after 90 days), but in the meantime it’s hanging around, even though the old resource has been removed from the Admin Console.
The consequence of this is that the obsolete ResourceID is used by the PackageMapping process and thus the wrong apps are re-installed to the client. The same problem occurs in the lab when re-imaging test machines; even when the test machine’s record is deleted from the admin console, its inventory data remains and will be used by the PackageMapping process.
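A quick way to see this on a suspect machine is to list every ResourceID the site database holds for its MAC address and flag which ones are still valid. This is a sketch using the views already mentioned; replace SMS_xxx and the MAC address with your own values:

```sql
-- Sketch: show valid vs obsolete resource records for one MAC address.
-- Any row flagged 'obsolete' is stale inventory that package mapping
-- would otherwise pick up.
SELECT n.ResourceID,
       CASE WHEN v.ResourceID IS NULL THEN 'obsolete' ELSE 'valid' END AS Status
FROM SMS_xxx.dbo.v_GS_NETWORK_ADAPTER n
LEFT JOIN SMS_xxx.dbo.v_R_System_Valid v
       ON v.ResourceID = n.ResourceID
WHERE n.MACAddress0 = '00:11:22:33:44:55'
```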
The fix for this is to modify the MDT RetrievePackages stored procedure to validate the ResourceID against the new ConfigMgr R_System_Valid view.
To do this using SQL Server Management Studio browse to the stored procedure, right click – modify, then add the following to the end of the supplied code:
AND n.ResourceID IN (Select ResourceID from SMS_xxx.dbo.v_R_System_Valid )
Alternatively, replace the provided stored procedure by executing:
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[RetrievePackages]') and OBJECTPROPERTY(id, N'IsProcedure') = 1)
drop procedure [dbo].[RetrievePackages]
GO
CREATE PROCEDURE [dbo].[RetrievePackages]
    @MacAddress CHAR(17)
AS
SET NOCOUNT ON
/* Select and return all the appropriate records based on current inventory */
SELECT * FROM PackageMapping
WHERE ARPName IN
(
    SELECT ProdID0
    FROM SMS_xxx.dbo.v_GS_ADD_REMOVE_PROGRAMS a,
         SMS_xxx.dbo.v_GS_NETWORK_ADAPTER n
    WHERE a.ResourceID = n.ResourceID
      AND MACAddress0 = @MacAddress
      AND n.ResourceID IN (SELECT ResourceID FROM SMS_xxx.dbo.v_R_System_Valid)
)
Now when Package Mapping is called, only non-obsolete data will be used and everything will function as expected. This has the added advantage in a lab of allowing you to delete the resource record of a test PC from the console and have that machine return no results from Package Mapping.
Thanks to John Nelson for his help with this on MyITForum.
We’ve had some problems over the past couple of weeks with our RTM Windows 7 rollout. Everything was going fine, then the gremlins seemed to creep in and suddenly the install of Windows 7 would just hang at the “Completing Installation” phase of the image install. This didn’t appear to be anything to do with MDT, as Windows is off on its own at this point.
I eventually realised that I’d made a mess of the driver management in the MDT Workbench. We have been using MDT to build both XP and Windows 7 whilst Windows 7 was in Beta and Release Candidate. We’re now ending our XP deployments, but are keeping the build in the workbench for now for legacy support reasons (and VM builds…). We have a massive Windows XP driver repository which, in truth, we could have managed better. Windows 7 on the other hand is fairly tidy as we need very few drivers at the moment (over 89% of the devices on our network are supported on the Windows 7 RTM media according to our MAP assessments). I’d separated XP and Windows 7 drivers out into separate folders in the workbench:
But this was as far as my separation had got. Quite how I thought MDT was going to sort through all this, I’m not sure… Anyway, the point of this post is that we can now use MDT’s new Selection Profiles to manage which drivers are added to which machines in the Task Sequence. There are a couple of things we have to do…
First, create a folder structure for your drivers such as that above (I also split drivers out by machine model below the OS folder for better management; in large environments this is crucial).
Next, create a selection profile per OS:
Each selection profile allows you to select the driver folder you want included:
Now in the task sequence for Windows 7 we can force the Apply Drivers task to only reference the “Windows 7 Drivers” selection profile:
Now when the install of the image runs, we only attempt to inject Windows 7 drivers. It speeds up the build process and fixes the hanging issue…
MDT 2010 RTM’d yesterday, so what better time to install?
RTM is version 5.0.1641.0
It (naturally) wants the release version of WAIK 3.0:
Once the old WAIK is removed and the new WAIK installed:
Sort out the wimgapi.dll clash and we’re away… (well, almost)
We’ve got to upgrade the deployment share again (this is the third time: beta to RC, and now RC to release). It’s a nice job that the deployment team support upgrades this far; it’s saved us recreating everything each time…
Once that’s done we update the deployment share to refresh our boot images etc.: the update process detects that there’s a new WAIK version, so opts to update the boot image.
Following this and copying the updated boot image to our WDS server, we’re off building machines again!
In a fit of perhaps unwarranted optimism we have initiated a live pilot of Windows 7 for a bunch of users in our company environment. There are a number of good reasons for this which we laid out in an invitation flyer to the volunteers/victims, i.e.:
- Expose internal systems and consultancy teams to the latest Microsoft desktop deployment technologies and operating system.
- Provide backwards-compatibility with existing in-use OS (Win XP) to allow a single approach for all internal OS imaging tasks.
- Provide front office staff with exposure to the new Microsoft OS and collect data on the real-world performance.
- Implement Microsoft Application Virtualization to provide a rapid application provisioning platform.
We are running this as we would for a live customer, so have completed a costed proposal and scoping exercise and are implementing the technologies involved in a structured and managed way. We’re taking the opportunity to implement App-V along with the Windows 7 pilot. The logic behind this is:
- App-V is great. It’s an excellent application provisioning solution and will deliver a real benefit to our support department from day one.
- The Windows 7 pilot will result in the building and re-building of PCs throughout the pilot process. If we don’t have to worry about the apps (as we’ll stream them to AD groups) this makes our rebuild process more efficient and supportable.
- We’re implementing a new VDI training environment at the same time. App-V is a great solution for this environment too; the apps are the same in both environments, so once sequenced they will be reusable in both.
We’re using the Microsoft Deployment Toolkit (of course) v. 2010 Beta 1. This is a technology we’re very familiar with as we use it for our customers (normally integrated with ConfigMgr). In this environment we’re using Lite Touch deployment (ZTI isn’t currently supported and we don’t have ConfigMgr deployed anyway).
Under MDT 2010 we’ve been able to create and capture a reference image with our core apps in the usual way, but the deployment of this image does not function correctly under Beta 1 of MDT. The main problems are that the image build doesn’t join the domain or process the State Restore phase of the build, so no apps get deployed. Along with this, as the local Administrator account is disabled by default in Windows 7, we have to use SIM to add an additional local admin account, otherwise we can’t log on to the newly installed image.
Other than these little irritations it works, but isn’t yet fully deployable, so we’re holding off for MDT 2010 Beta 2 which is due out any time now…
We’ve now tested the Lite Touch New Computer and Refresh Computer scenarios, which (above issues aside) work well. USMT 4 in particular is a very interesting piece of kit and is going to make a huge difference to our deployment times (some of our users have over 100GB of local data!).
Task Sequence Tips and Tricks
We’re taking the opportunity to try out some Gucci little tips and tricks during this build to pretty up the process, things which we don’t normally have the time or scope for with customer engagements.
Mainly these are little cosmetic touches, the first being minimizing the startnet.cmd window at runtime using David Clarke’s rather marvellous ConsoleSize. Getting this in place is slightly arduous: you have to add it to PE and modify startnet.cmd to run it before wpeinit, which involves mounting the WAIK copy of winpe.wim (this is then used by MDT to build our custom PE wim/iso). It works, and means that we can reap the benefit of the next little tweak:
Use Custom Backgrounds to Illuminate the Build Progress
At this year’s MMS, The Microsoft Modena Team demonstrated using BGInfo and a number of custom backgrounds to brand up the build process. Essentially this calls BGinfo at the start of critical phases of the build to repaint the wallpaper and tell the engineer/end user where the build is up to. This looked pretty cool, so we’ve put it in place in our build process.
As BGInfo is called, it replaces wallpaper 1 with wallpaper 2, then 3, 4 and so on, giving the appearance of the progress list updating. We also use BGInfo to display bits of information about the machine (RAM, IP address etc.) at the bottom left.
It’s very easy to implement and makes for an attractive build process!
Throughout the pilot we’re providing the participants with regular project flyers detailing some of the new features of Windows 7 which they might want to try out that aren’t immediately obvious (Aero Shake for example…).
We’ll be collecting feedback on these features to get some real-world insight into which features of 7 are delivering benefit to the non-tech users we have.
I’ll be adding in further postings as the project progresses, particularly around App-V, USMT and of course Windows 7 itself.