vCloud Automation Center – vCAC 5.1 – Executing Scripts with the Linux Guest Agent

I’ve been getting asked a lot of questions about how to execute scripts within Linux guest systems using vCAC. There are a few components to executing scripts in a Linux guest OS, which I’m going to cover in this post:

  • Linux Guest Agent
  • Custom Properties

Linux Guest Agent

The Linux guest agent provides a number of benefits if you utilize it. It is a small agent that acts very similarly to the vCAC proxy agents. When you install it you give it the name or IP address of the vCAC server, which allows it to check in with the server when it loads on a newly provisioned machine and determine whether there is anything it needs to do. If the vCAC server has work for it, the server sends the instructions and the agent executes them on the local guest operating system. The guest agent comes with a number of pre-built scripts and functions, but also allows you to execute your own scripts. Some of the features available with the Linux guest agent are:

  • Disk Operations – Partition, format, and mount disks that are added to the machine.
  • Execute Scripts – Execute scripts after the machine is provisioned.
  • Network Operations – Configure settings for additional network interfaces added to the machine.


This article is primarily about executing scripts with the guest agent; however, once the guest agent is installed, if you add a disk and define the proper parameters you will also be able to utilize the disk operations it supports. I will discuss networking in more detail in a different article.
 

Installing the Guest Agent

1. You will need to install the Linux guest agent in the VMware template that you will be using. In my example I’m using CentOS 6.3 x64.
 
2. You will need to get the agent into your template, either by accessing it across the network or by putting it on an ISO and mounting it to the VM. The agents are part of the vCAC installation package and are located under “LinuxGuestAgentPkgs”, where you will find agents for a number of flavors of Linux.
 
3. Once you have the agent on your template, you need to install the agent package. There are generally two packages, a tar.gz as well as an RPM. I will be installing the RPM for rhel6-amd64. The specific package name is “gugent-5.1.1-56.x86_64.rpm”.
 
4. Install the package by doing the following: rpm -ivh path/gugent-5.1.1-56.x86_64.rpm. In the image below mine was already installed, but you get the idea.
[Image: vcaclgas-1]
5. Next we need to execute the agent installer. Go to /usr/share/gugent and run ./installgugent.sh {vcac server}; you can use either the IP address or the hostname of the vCAC server.
[Image: vcaclgas-2]
6. You can verify the proper name/IP by doing “cat rungugent.sh” and looking at the “--host=” statement on the 6th line.
[Image: vcaclgas-3]
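Taken together, steps 4 through 6 look roughly like the following at the console. This is just a sketch from my lab; the path to the RPM and the vCAC server name are placeholders, so substitute the location where you copied the agent package and your own vCAC hostname or IP.

  # Install the guest agent RPM (placeholder path - point this at wherever the RPM actually lives)
  rpm -ivh /mnt/LinuxGuestAgentPkgs/rhel6-amd64/gugent-5.1.1-56.x86_64.rpm

  # Register the agent with the vCAC server (placeholder hostname)
  cd /usr/share/gugent
  ./installgugent.sh vcac01.lab.local

  # Verify the server name/IP that was written into rungugent.sh
  grep "host=" rungugent.sh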
7. We next need to rename one of the scripts included with the Linux guest agent due to a bug. Navigate to “/usr/share/gugent/site/CustomizeGuestOS”. In this folder you will see a script named “20_static_ip.sh”; this script is deprecated and not needed. Rename it to “old_20_static_ip.sh” by doing “mv 20_static_ip.sh old_20_static_ip.sh”.
[Image: vcaclgas-4]
8. If you want to follow the example that I will be giving, you should also create a folder on the template to be utilized as a mount point for an NFS share. I created “/Scripts” as my mount point by doing “mkdir /Scripts”.
 
9. Next I created a helper script in the template that I will use to mount my NFS share. I placed my script in the root of the disk, but you can place it wherever you like. A good place for it would be “/usr/share/gugent/site/CustomizeOS”, but it really doesn’t matter where you place it.
 
10. My script is named “repomount.sh” and looks like this: “mount -t nfs $1 /Scripts”. Nothing fancy at all.
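With a shebang added, the whole helper script is only a couple of lines. The NFS export to mount is passed in as the first argument ($1):

  #!/bin/bash
  # repomount.sh - mount the NFS export passed as the first argument onto /Scripts
  mount -t nfs $1 /Scripts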
 
11. Once you have the script created, go ahead and “shutdown” your template machine and “Convert it to a template” in “vCenter”.
 

Configure vCAC

12. If this is a “new template” that you just created, you will need to “manually” perform a “Data Collection” in “vCAC”. You can do this by navigating to “Enterprise Administrator”, selecting “Compute Resources”, hovering over the cluster the template exists in, and then selecting “Data Collection”. Once there, click “Request Now” under “Inventory”.
 
13. Once the “Data Collection” is complete your “template” will be ready for use.
 
14. If you don’t already have a blueprint that can deploy this “template” you will need to create one. See my article on “Connecting to vCenter” for instructions on creating a blueprint.
 
15. Next we are going to create a “Build Profile” that we can use to hold our properties. Navigate to “Enterprise Administrator” and select “Build Profiles”, then select “New Build Profile”.
16. Give the “Build Profile” a “Name” and then add some “Custom Properties” to the profile. We need to add the following properties and values (a complete example follows the list):

  • repo.path – This is a property that I made up for this example; it is not part of vCAC. The value should be the path to your “NFS” share in the format “IP:/volume/share”. This is what we are going to mount to the folder we created earlier.
  • VirtualMachine.Admin.UseGuestAgent – This tells vCAC to utilize the guest agent as part of the deployment process. The “value” should be “true”.
  • VirtualMachine.Customize.WaitComplete – This tells vCAC to wait until the vCenter guest customization is complete. If you do not use “Customization Specifications” you do not need this property. The “value” should be “true” if you use vCenter guest customization.
  • VirtualMachine.Software0.Name – Assign a name for the script you are going to execute. The “value” is a “Friendly Name” for your script.
  • VirtualMachine.Software0.ScriptPath – The path to your script, including the script name. You can pass parameters to your script as well. I’m passing the value of the “repo.path” property that I created earlier to my script, to be utilized as the “NFS location” to mount. The value is “/repomount.sh {repo.path}”. Note: The {} brackets are required in the value.
  • VirtualMachine.Software1.Name – I’m executing a second script, so I’ve used the same property again, only with a 1 instead of a 0 this time.
  • VirtualMachine.Software1.ScriptPath – I’m executing a second script. This one is located on the NFS share that I am mounting with my first script. My script is “BashScript.sh” and it contains “echo "The script was successfully executed" > /Script_Successfull.txt”.
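Put together, the finished build profile looks roughly like the name/value pairs below. The NFS export address and the two friendly names are made-up examples for illustration, and I’m assuming “BashScript.sh” sits in the root of the share so that it appears under the “/Scripts” mount point once the first script has run.

  repo.path = 192.168.1.50:/vol/scripts        (example value - use your own NFS export)
  VirtualMachine.Admin.UseGuestAgent = true
  VirtualMachine.Customize.WaitComplete = true
  VirtualMachine.Software0.Name = Mount Script Repo
  VirtualMachine.Software0.ScriptPath = /repomount.sh {repo.path}
  VirtualMachine.Software1.Name = Create Test File
  VirtualMachine.Software1.ScriptPath = /Scripts/BashScript.sh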

17. Click “Ok” to save the “Build Profile”.
[Image: vcaclgas-5]
18. Next we need to assign this build profile to our “Linux Blueprint”. Go to “Enterprise Administrator”, select “Global Blueprints”, then select the “Linux Blueprint” and click “edit”.
 
19. Once the “Blueprint” opens, select the “Properties” tab, select the “Build Profile” you just created, and then click “Ok”.
[Image: vcaclgas-6]

So what does all this do?

When you request a machine, custom properties are associated with it. Some custom properties are what we call reserved properties, because vCAC understands them and performs actions based on them. The “VirtualMachine.Admin.UseGuestAgent” property tells vCAC that when the machine is provisioning it needs to create work items for the “Linux Guest Agent” to pick up. The “VirtualMachine.SoftwareX.Name” and “VirtualMachine.SoftwareX.ScriptPath” values are put into the work item created by vCAC to instruct the “Linux Guest Agent” to execute those scripts.
 
In this example the first script will execute and mount the NFS share that I defined with my own property “repo.path”. Once that completes, the second script will run and execute “BashScript.sh”, which will create a text file. The guest agent performs its tasks after the VMware guest customization if it is being utilized; otherwise it performs them after the “Machine Provisioned” state. Once the Linux agent has completed all of its work items, it will remove itself from loading on future boots and stop its service.
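For reference, the second script sitting on the NFS share is just as trivial as the first. Sketched out with a shebang, it looks like this:

  #!/bin/bash
  # BashScript.sh - executed by the guest agent once the NFS share is mounted
  echo "The script was successfully executed" > /Script_Successfull.txt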


