There is a great wave in the MacAdmin community around an approach to imaging computers called Modular Imaging.
I am by NO MEANS early to this party; these Mac deployment techniques have been around for quite a while. I am just hoping to shed some light on the subject for those who haven’t quite grasped it yet.
Also known as Thin Imaging (depending on your approach/technique they can be slightly different), the basic concept of Modular Imaging is that you apply the preferences, apps, and other elements of a customised configuration to a machine in layers. As elements of this configuration change, it’s very easy to update individual components as opposed to building a whole new customised setup.
Hold On – What’s Imaging!?
So if you were like me a few years ago, you were a happy-go-lucky consultant who sold Macs to businesses and home users. You were sometimes lucky enough to get paid to go visit these poor souls and be an “IT Guy” for a couple of hours while you used your above-average awareness of technology to bedazzle them into making their Mac do what they wanted. If they had 3 or 4 Macs to set up, then you went through a process of turning each one on, creating a new user from scratch, installing some software from a CD/DVD, setting some preferences, fighting with a piece of software that didn’t want to play nice, and (if you were smart) writing it all down so you could repeat the steps verbatim 3 more times over…
That was great… those were good days… You billed a lot of time for the work and the customer was happy that everything was handled with a fine-tooth comb and seemed to work well. The issue, though, is that as they got more machines and the company grew, their experience was a little different on each machine; they started having different Mac OS X versions and things felt a little “inconsistent”. You also got a little tired of doing the same process on 5 or 10 different machines… if only there were a way to automate it…
*Rides in on a white unicorn*
Building An Image
See, if you were a crafty little Mac Consultant, you knew about cloning and how a Mac can quite happily clone hard drive contents and operating systems from one volume/computer to another. If you upgraded the hard drive in a White 2008 MacBook from a 120GB 5400rpm drive to a 500GB 7200rpm drive, you could just restore/clone the contents from one to the other over FireWire 400, and in a couple of hours the computer would “Just Work” like it did before, except quicker and with more hard drive space! No re-install or re-setup of an operating system required! Yay!
The concept of imaging built on this principle: if I bake a core set of users, settings, and apps into a Mac I am setting up, I can save this “Macintosh HD” volume with all of the included hard work to a Disk Image (DMG) file and then restore/copy it onto other Macs so that they all have the same content, but I only have to do the “hard” work ONCE. I could then run around with a bootable hard drive containing the DMG and restore it onto as many Macs as I wanted; brilliant!
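For those who never lived it, the capture-and-restore cycle boiled down to something like the commands below. This is only a rough sketch: the volume names and paths are illustrative, and people used different tools (Disk Utility, Carbon Copy Cloner, etc.) to do the same job.

```
# Capture the hand-built "Macintosh HD" volume into a compressed disk image
# (run this while booted from a different volume so the source isn't in use)
sudo hdiutil create -srcfolder "/Volumes/Macintosh HD" -format UDZO "/Volumes/External/MasterImage.dmg"

# Prepare the image for block-level restores with Apple Software Restore (asr)
sudo asr imagescan --source "/Volumes/External/MasterImage.dmg"

# Restore it onto the next Mac's internal drive (this ERASES the target volume)
sudo asr restore --source "/Volumes/External/MasterImage.dmg" --target "/Volumes/Macintosh HD" --erase
```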
The Next Step: NetBoot
Apple invented a wonderful technology (in 1999 according to Wikipedia) called NetBoot. This allows a Mac to start up from a system that is not physically living on its own hardware. In other words, a bootable system living somewhere on the local network can be used to start up the Mac instead of the operating system on its internal hard drive.
A variation of NetBoot called NetInstall was also created, which means a remote startup system can be used not only to run the local Mac but also to install a new/different system onto it. This means you can have a specific Mac operating system installer prepared on a Mac server and start a new Mac up using NetBoot to install it. Brilliant!
Using another variation of NetBoot, called NetRestore, you can take a prebuilt Mac disk image (like the one we made in the section above), put it on a Mac server and restore it onto a local Mac over the network. This technology opened A LOT of doors in places like school computer labs, as you could have a couple dozen Macs all running the same prebuilt Mac OS configuration in a fairly easy manner. It also meant that if a kid (or employee of a company) really screwed up a computer system, no worries! Just start the computer up from the NetRestore system and after a few minutes, voila! All back to normal. Adding more Macs into the lab? No worries, NetRestore to the rescue…
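You don’t need any commands to use NetBoot (holding N, or Option, at startup is enough), but for completeness, a client can also be pointed at a NetBoot server from the command line with Apple’s bless tool. The broadcast address below simply means “use whichever NetBoot server answers”; a specific server address works too.

```
# Set this Mac to start up from a NetBoot server on its next boot.
# bsdp://255.255.255.255 broadcasts for any available NetBoot/NetInstall server.
sudo bless --netboot --server bsdp://255.255.255.255
```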
And It Was Born: Monolithic Imaging
So, if Modular Imaging is the new cool way, what is the old way called? Monolithic or Thick Imaging.
Opposite of Modular = Monolithic
Opposite of Thin = Thick
Ok, now that we have that covered, let me explain the photo above…
Monolithic imaging is the process of creating an SOE (Standard Operating Environment) by baking a bunch of core users, settings, and apps (sound familiar?) into a Disk Image (also known just as an “image”) and deploying it to a bunch of computers. The issue with this process is that you wind up with a bloated image file that you need to restore onto erased computers every time you want to use it, and it’s NOT very agile or flexible when you need to make changes (now the photo starts to make sense…).
In a Monolithic imaging process you need to make a new copy of the image every time you change preferences, update software, etc. As an example, if you have a full Adobe Master Collection installed in an SOE image for a school lab, you could be dealing with an 80GB or 90GB file that needs to be created and compressed every time you make changes; that is a huge amount of pain and time just in creating image files. If you get a new set of Macs for a school that need to run the latest macOS but your SOE has been built for an older version, it takes a lot of work to get it ready for the new machines. Again, not an ideal scenario. In a lot of cases your SOE wasn’t freshly made either; it was often upgraded and adapted a few times to get it where it needed to be, making it a bit of a Frankenstein.
Profiles and Packages
So a time came for me in my journey to MacAdmin-dom (MacAdminhood?) when I was baking some clever tools into my grand Monolithic/SOE images and deployment techniques. I was using tools like DeployStudio (we’ll get to that) to deploy my images over the network to Macs, with various packages rolled in on the fly to update/adjust my pre-made images; this meant I didn’t have to re-create my image every time I needed to make changes, at least for some things.
I figured out ways to add preference files and other content to User folder templates so that when network directory users logged into a Mac for the first time, they already had settings in place for their first launch of Microsoft Word. I used configuration profiles made with Apple’s Profile Manager to set dock items, energy saver preferences and other items so I wouldn’t need to bake them into my SOE… things were looking up, with a world of intelligent automation at my fingertips. I was using layering techniques to add on top of a pre-baked image and alter its behaviour on the fly…
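As a taste of the user-template trick, here is the kind of thing I mean. The template path shown is the one used on the OS X releases of that era and the Dock preference is just an illustrative example; newer macOS releases lock this area down, which is one more reason configuration profiles and login scripts won out.

```
# Drop a default preference into the user template so every NEW user
# account created on this Mac picks it up at first login.
# (Template path as used on older OS X releases; domain/key are examples.)
sudo defaults write \
  "/System/Library/User Template/English.lproj/Library/Preferences/com.apple.dock" \
  autohide -bool true
```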
AutoDMG: No More Waiting For Installers
GAME CHANGER! Ok this is where things really got interesting. Here is a tool developed by Swedish mastermind Per Olofsson which could take a macOS Installer App and create a fresh DMG file of that OS, PLUS, include some of your favourite Apps/Packages to create an SOE; how cool!
All we had to do was drag the installer App onto a window, drag in a few packages and voila! We had an image with pre-baked stuff we could deploy. Gone were the days of turning on a machine, creating a user, installing some stuff, installing some more stuff, waiting for that to finish to install some more stuff…..
We could use AutoDMG to create an image and either deploy it using a NetBoot-style function or stick it on a hard drive and run around doing Apple Disk Utility restores or Carbon Copy Cloner clones….
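AutoDMG is mostly drag and drop, but it also ships a command-line mode, which is handy once you start scripting builds. Something along these lines – the installer name, output path, and extra package here are illustrative, and it’s worth checking AutoDMG’s --help output as the flags have changed between versions:

```
# Build a base image from an installer app, with one extra package baked in
# (paths are illustrative; confirm exact flags with AutoDMG --help)
/Applications/AutoDMG.app/Contents/MacOS/AutoDMG \
    build "/Applications/Install OS X El Capitan.app" \
    -o ~/Desktop/osx_10.11.6_base.dmg \
    ~/Packages/CompanyBranding.pkg
```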
And Then It Hit Me…
If I use AutoDMG to make completely blank macOS images and then roll everything else in dynamically as its own set of “layers”, we get a VERY flexible and easily changeable deployment system that evolves as the environment does. New Macs? No problem. New Macs that need a different operating system? No problem, insert a new AutoDMG image. Want different deployment sets for different departments that all use the same core elements but add different apps per department? No problem….
ZERO time invested in building new DMG files for deployment; instead, 100% of my time spent building cool logic into deployment workflows, with intelligent packages and scripts to customise the Mac being deployed.
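In practice each “layer” is just an ordinary installer package, and a payload-free package that only runs a script is one of the most common building blocks. Here is a minimal sketch using Apple’s pkgbuild; the identifier, script contents, and package name are all made up for illustration.

```
# Build a hypothetical payload-free "layer" that only runs a postinstall script
mkdir -p scripts
cat > scripts/postinstall <<'EOF'
#!/bin/bash
# Illustrative customisation applied as its own layer
/usr/sbin/systemsetup -settimezone "Australia/Sydney"
exit 0
EOF
chmod +x scripts/postinstall

pkgbuild --nopayload \
         --scripts scripts \
         --identifier com.example.layer.timezone \
         --version 1.0 \
         SetTimeZone-1.0.pkg
```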
I was seeing the fruits of modular elements in my packages and scripts for my SOE deployments; I was hearing the terms Thin Imaging and Modular Imaging being thrown around on the web but hadn’t seen them first-hand.
Once I connected the dots… my own Modular Imaging workflows were born
True Thin Imaging: Profiles & Packages
Wait, didn’t I already read this section!?
No, no you didn’t…. well sort of 🙂
So remember when I said Thin Imaging is the same as Modular Imaging? Well, it’s ALMOST the same, except Thin Imaging goes ONE STEP FURTHER.
Thin Imaging basically takes all of the workflows you developed for Modular Imaging and removes one element: erasing the machine and restoring a base image. In true Thin Imaging you do not need to create a fresh OS with AutoDMG; instead, you install profiles and packages onto the fresh (or used) macOS that came with the computer you are “imaging”.
I like to think of true Thin Imaging more as “Provisioning”, as you are applying apps and settings to what’s already there rather than wiping it and giving it a fresh start. Of course, the end goal is the same: a Mac with the specific functionality the end user needs to get stuff done (and hopefully enjoy it :))
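In other words, the “provisioning” pass is nothing more exotic than installing your packages and profiles in place on the OS that shipped with the machine. A rough sketch, assuming the packages and profile already exist and using the pre-10.13 profiles syntax of that era (all paths are illustrative):

```
# Apply the same "layers" directly to the existing OS, no erase required
sudo installer -pkg /path/to/CoreConfig.pkg -target /
sudo installer -pkg /path/to/CoreApps.pkg -target /

# Install a configuration profile for managed settings (pre-10.13 syntax)
sudo profiles -I -F /path/to/ManagedSettings.mobileconfig
```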
A Chicago-based service called Robot Cloud is a great example of Thin Imaging in action, with a tool called JAMF Pro (formerly known as Casper Suite) at its core.
With services like Apple’s DEP (Device Enrolment Programme) and VPP (Volume Purchase Programme), devices can hook into a management system at the point of activation and start being configured with assigned settings, apps, and even device-specific licensing. The hands-on touch required from IT is minimal (often non-existent); this is the future of device deployment.
Modular Example
“OKAY…. I AM INSPIRED…”
But now we need examples 🙂
Here is a modular workflow example I built for a customer using DeployStudio
The Structure
The server infrastructure consists of 3 components:
- The DeployStudio Server software installed on a Mac Mini (running Server App) as the brains of the operation
- The DSS (DeployStudio Server) repository living on a Synology NAS (could live on the same server as above)
- The NetBoot/NetInstall services running on the Mac Mini with Server App to allow clients to Option+N boot and start up from either a 10.10.5 or 10.11.6 NetBoot image for deployment
The Workflows
In this example there are 3 workflows developed:
- eCommerce Restore
- General Restore
- Video Edit Restore
The workflows have been set up in a modular way so that, as the deployment requirements evolve over time, it’s easy to add to the existing workflows. They currently represent 2 department-specific workflows and 1 general workflow.
Sub Workflows
In order to achieve a simple layering structure, the workflows above actually have no smarts. Each workflow has an alert to advise what’s going to happen, but then uses nested workflows to do all of the work. This way, if we want to update the OS version used by ALL workflows, we edit it in one place, not in several.
Each top-level workflow includes the same 3 core sub workflows, plus anything specific to that type of machine.
Core OS Restore does the following:
- Asks for hostname/computer name
- Asks for user accounts to create
- Erases/Partitions the boot drive
- Restores the fresh/empty macOS image
- Configures the items requested earlier (hostname, user account, etc); a sketch of the hostname step follows this list
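DeployStudio handles the hostname piece with its built-in tasks, but the same step can be scripted. Here is a sketch of the kind of first-boot snippet involved; the name is obviously a placeholder for whatever the technician entered.

```
#!/bin/bash
# Apply the computer name gathered earlier in the workflow
# (NEW_NAME is a placeholder value for illustration).
NEW_NAME="ECOM-MAC-01"

scutil --set ComputerName "$NEW_NAME"
scutil --set HostName "$NEW_NAME"
scutil --set LocalHostName "$NEW_NAME"
```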
Core OS Config does the following:
- Installs a predefined admin user account
- Installs monitoring software
- Installs remote support software
- Installs a configuration profile for managed settings
- Copies user login script files
- Copies custom desktop background and custom user logo (a sketch covering a couple of these steps follows this list)
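Each of those items is either a small package or a script task. To give a flavour, here is a sketch covering two of them, assuming it runs as root during deployment; the account details and file paths are purely illustrative, and sysadminctl is the tool available on the 10.10/10.11 systems this workflow targets.

```
#!/bin/bash
# Create the predefined local admin account (details are illustrative!)
sysadminctl -addUser localadmin -fullName "Local Admin" \
            -password "ChangeMe123" -admin

# Stage the custom desktop background where the login script expects it
mkdir -p "/Library/Desktop Pictures/Company"
cp "/private/tmp/CompanyBackground.jpg" "/Library/Desktop Pictures/Company/"
```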
Core Apps Install does the following:
- Installs common apps and plugins like Adobe Flash, Numbers, Pages, Keynote, Firefox, Chrome, and Microsoft Office
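Inside DeployStudio this is simply a stack of package-install tasks, but the equivalent script is a loop over a folder of staged installers (the folder path here is illustrative):

```
#!/bin/bash
# Install every package staged in the Core Apps folder onto the boot volume
PKG_DIR="/Volumes/Deploy/CoreApps"

for pkg in "$PKG_DIR"/*.pkg; do
    echo "Installing $(basename "$pkg")..."
    installer -pkg "$pkg" -target /
done
```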
If you need to update the version of the OS used, fire up our trusty tool AutoDMG and just replace the OS image in the “__1 Core OS Restore” workflow, which is used by all 3 public-facing workflows.
If you want to update the version of Flash used (with a package created by AutoPKG), you only need to replace it in the “__3 Core Apps Install” workflow, which is used by all 3 public-facing workflows.
Making New Workflows
If you want to make a new workflow (i.e. for the Retail department), you would copy the “*eCommerce OS Restore” and “_eCommerce Apps Install” workflows, rename them appropriately for Retail, and make sure the top level references the bottom level along with the other 3 core sub workflows.
Automating Your Packages
As mentioned above, you could improve on this example setup by implementing a more ongoing and automated app deployment suite built on Munki, AutoPKG, and AutoPKGr. This could download regular app updates and deploy them automatically to both new and existing machines. I will write separate blog posts about these tools in the future, but Munki-In-A-Box is a great way to get a jump start with them.
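To give a flavour of the AutoPKG side of that: once it is installed, you add a recipe repository and run recipes, and each run checks for, downloads, and packages the latest version of an app. The recipe names below come from the public recipes repo and may change over time.

```
# Add the public AutoPkg recipe repository, then build fresh packages
autopkg repo-add recipes
autopkg run Firefox.pkg GoogleChrome.pkg
```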
Make This “Thin”
If you want to do all of this without restoring a fresh OS and just use what the machines already have, then you’d make some minor tweaks, like removing the re-partition and restore parts of the Core OS Restore workflow.
Benediction
Hopefully this post has been entertaining and somewhat enlightening. If you have any questions or comments, feel free to get in touch.