VCAP7-DTM Design Exam Passed

On 21 October I took my first shot at the VCAP7-DTM Design exam and failed, as you may already know from this article. Today I am happy to share that I finally passed the exam! 🙂

What did I do with the information and notes about my weaknesses from the last exam score report? I read a lot of additional VMware documents and guides about:

  • Integrating AirWatch and VMware Identity Manager (vIDM)
  • Cloud Pod Architecture
  • PCoIP/Blast Display Protocol
  • VMware Identity Manager
  • vSAN 6.2 Essentials from Cormac Hogan and Duncan Epping
  • Horizon Apps (RDSH Pools)
  • Database Requirements
  • Firewall Ports
  • vRealize Operations for Horizon
  • Composer
  • Horizon Security
  • App Volumes & ThinApp
  • Workspace ONE Architecture (SaaS & on-premises)
  • Unified Access Gateway
  • VDI Design Guide from Johan van Amersfoort

Today, a few of the exam questions were different, but it seems that reading more PDFs about the topics mentioned above helped me pass. In addition, I attended a Digital Workspace Livefire Architecture & Design training, which is available to VMware employees and partners. The focus of this training was not only on designing a Horizon architecture, but also on VMware’s EUC design methodology.

If you have the option to attend classroom trainings, then I would recommend the following:

There were two things I struggled with during the exam: sometimes the questions were not clear enough, so I had to make assumptions about what they meant, and the exam is still based on Horizon 7.2 and other older product versions of the Horizon suite:

  • VMware Identity Manager 2.8
  • App Volumes 2.12
  • User Environment Manager 9.1
  • ThinApp 5.1
  • Unified Access Gateway 2.9
  • vSAN 6.2
  • vSphere 6.5
  • vRealize Operations 6.4
  • Mirage 5.x

But maybe it’s just me, since I have almost no hands-on experience with Horizon, none with Workspace ONE, and on top of that I have only been with VMware for 7 months now. 🙂

It is time for an update, and VMware has already announced that they will publish a new design exam version called VCAP7-DTM 2019 next year.

What about VCIX7-DTM?

In part 2 of my VCAP7-DTM Design exam blog series I mentioned this:

Since no VCAP7-DTM Deploy exam is available, and it’s not yet clear when this exam will be published, you only need the VCAP7-DTM Design certification to earn the VCIX7-DTM status. I got this information from VMware Certification.

This information is not correct, sorry. VMware Certification retracted their statement and clarified that, as long as no VCAP7-DTM Deploy exam is available, you need to pass the VCAP6-DTM Deploy exam to earn the VCIX7-DTM badge.

I don’t know yet if I want to pursue the VCIX7-DTM certification and will think about it when the deploy exam for Horizon 7 is available.

What’s next?

Hm… I am going to spend more time with my family again and will use some of my three weeks of vacation to assemble and install my new home lab.

Then I also have a few ideas for topics to write about, like:

  • Multi-Domain and Trust with Horizon 7.x
  • Linux VDI Basics with Horizon 7.x
  • SD-WAN for Horizon 7.x
  • NSX Load Balancing for Horizon 7.x

These are only a few items from my list, but let’s see if I really find the time to write a few articles.

In regards to certification, I think I’ll continue with these exams:

This has no priority for now and can wait until next year! Or… I could try the VCP-DW 2018 exam since I have vacation. Let’s see 😀

New Supermicro Home Lab

For a few years I’ve been using three Intel NUC Skull Canyon (NUC6i7KYK) mini PCs for my home lab. Each NUC is equipped with the following:

  • 6th Gen Intel i7-6770HQ processor with Intel Iris Pro graphics
  • 2x 16GB Kingston Value RAM DDR4-2133
  • 2x 500GB Samsung 960 EVO NVMe M.2
  • 1x Transcend JetFlash 710S USB boot device

These small computers are nice in terms of space, but they are limited to 32GB of RAM, have only one network interface, and have no separate management interface.

This was enough and acceptable when I worked with XenServer, used local storage, and just had to validate XenDesktop/XenApp configurations and designs during my time as a Citrix consultant.

When I started to replace XenServer with ESXi and created a 3-node vSAN cluster for my first Horizon 7 environment, everything ran fine at the beginning. But after a while I had strange issues with vMotions, OS installations, and VCSA or ESXi upgrades.

So, I thought it was time to build a “real” home lab and was looking for ideas. After doing some research and talking to my colleague Erik Bussink, it was clear to me that I had to build my compute nodes on a Supermicro mainboard. As you may know, the Skull Canyons are not that cheap, and therefore I will continue using them for my domain controller VMs, vSAN witness, vCenter Server appliance, etc.

Yes, my new home lab is going to be a 2-node vSAN cluster.
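
To give an idea of what that looks like on the vSphere side, here is a minimal pyVmomi sketch (my own illustration, not part of the actual build notes) of enabling vSAN on the new two-host cluster. The vCenter address, credentials, and the cluster name "Lab-vSAN" are placeholders, and the 2-node witness configuration itself is done separately (for example via the vSphere Client) and is not shown here:

```python
# Hypothetical example: enable vSAN on the new two-host cluster with pyVmomi.
# vCenter address, credentials, and the cluster name "Lab-vSAN" are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only: skip certificate validation
si = SmartConnect(host="vcsa.lab.local", user="administrator@vsphere.local",
                  pwd="VMware1!", sslContext=ctx)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.ClusterComputeResource], True)
    cluster = next(c for c in view.view if c.name == "Lab-vSAN")

    # Turn on the vSAN service for the cluster; disks get claimed manually later.
    spec = vim.cluster.ConfigSpecEx(
        vsanConfig=vim.vsan.cluster.ConfigInfo(enabled=True))
    task = cluster.ReconfigureComputeResource_Task(spec, modify=True)
    print("Reconfigure task started:", task.info.key)
finally:
    Disconnect(si)
```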

Motherboard

I found two Supermicro X11SPM-TF motherboards at a reduced price, because someone had ordered them and never used them. This was my chance and a “sign” that I had to buy the parts for my new home lab NOW! Let’s pretend it’s my Christmas gift. 😀

The key features for me?

Chassis

I went for the Fractal Design Node 804 because it offers enough space for the hardware and good cooling. I also like the square form factor, which allows me to stack the cases.

CPU

I need a certain number of cores in my system to run tests and to have enough performance in general. I will mainly run Workspace ONE and Horizon workloads (multi-site architectures) in my lab, but this will change in the future. So I chose the 8-core Intel Xeon Silver 4110 processor with 2.10 GHz.

Memory

RAM was always a limiting factor with my NUCs. I will reuse two of them and start with two 32GB 2666 MHz Kingston Server Premier modules for each ESXi host (64GB per host in total). If memory prices drop and I need more capacity, I can easily expand my system.

Boot Device

A Samsung 860 EVO Basic 250GB, which is way too much for ESXi, but the price is low and I could use the disk for something else (e.g. in a new PC) if needed.

Caching Device for vSAN

I will remove one Samsung 960 EVO 500GB M.2 from each NUC and use them for the vSAN caching tier. Both NUCs will still have one 960 EVO 500GB left to be used as local storage.

Capacity Device for vSAN

A Samsung 860 EVO Basic 1TB.
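
Before creating the disk groups, I want to double-check which local devices each ESXi host actually reports as SSDs. Here is a small pyVmomi sketch for that (again my own illustration, not part of the build itself); the vCenter address and credentials are placeholders:

```python
# Hypothetical example: list the local disks each ESXi host reports, to check
# which devices are SSD candidates for the vSAN cache and capacity tiers.
# vCenter address and credentials are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only
si = SmartConnect(host="vcsa.lab.local", user="administrator@vsphere.local",
                  pwd="VMware1!", sslContext=ctx)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        print(host.name)
        for lun in host.config.storageDevice.scsiLun:
            if isinstance(lun, vim.host.ScsiDisk):
                size_gb = lun.capacity.block * lun.capacity.blockSize / 1024**3
                media = "SSD" if lun.ssd else "HDD"
                print(f"  {lun.canonicalName}  {size_gb:.0f} GB  {media}")
finally:
    Disconnect(si)
```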

Network

Currently, my home network only consists of Ubiquiti network devices with 1GbE interfaces.

So I ordered the Ubiquiti 10G 16-port switch, which comes with four 1/10 Gigabit RJ45 ports – no SFPs needed for now. Maybe in the future 😀
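
Once the hosts are connected over 10GbE, the VMkernel interfaces for vSAN need to be tagged for vSAN traffic. A hedged pyVmomi sketch of that step, assuming a dedicated adapter called vmk1 and placeholder host names and credentials:

```python
# Hypothetical example: tag an existing VMkernel adapter (assumed to be vmk1)
# for vSAN traffic on both new hosts. Host names, the vmk device, and the
# vCenter credentials are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only
si = SmartConnect(host="vcsa.lab.local", user="administrator@vsphere.local",
                  pwd="VMware1!", sslContext=ctx)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        if host.name in ("esx01.lab.local", "esx02.lab.local"):
            # Enable the "vsan" traffic service on the chosen VMkernel adapter
            host.configManager.virtualNicManager.SelectVnicForNicType("vsan", "vmk1")
            print(f"Tagged vmk1 for vSAN traffic on {host.name}")
finally:
    Disconnect(si)
```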

This is the home lab configuration I ordered, and all parts should arrive by the end of November 2018.

What do you think about this setup?

Your feedback is very welcome!