|By Don MacVittie||
|December 12, 2012 10:00 AM EST||
All the goodness FPGAs bring to hardware in general, and ADO hardware in particular.
In two previous installments, I talked at a high level about the uses of FPGAs, risk mitigation, and the potential benefits. Today I’d like to delve into the benefits that the industry in general, and F5 in particular, gain from using FPGAs, and why it matters to IT. If you’re a regular reader, you know that I try not to be a chorus line for F5 solutions, but don’t shy away from talking about them when it fits the topic. That will continue with this post. While I will use F5 for the specifics, the benefits can be generalized to the bulk of the industry.
Used to be, way back in the day, everyone walked everywhere. That worked for a long period of world history. The horse was adopted for longer trips, and it about doubled travel speed, but still, the bulk of the world’s populace walked nearly all of the time. Then along came cars, and they enabled a whole lot of things. One of the great benefits the automobile introduced was the ability to be more agile. By utilizing the machinery, you could move from one town to another relatively quickly. You could even work in a town 30 miles – a day’s walk for a physically fit person – from your home. At this point in human – or at least first-world – history, walking is a mode of transportation rarely used for important events. There are some cities so tightly packed that walking makes sense, but most of us take a car the vast majority of the time. When speed is not of the essence – say, when you take a walk with a loved one – the car is left behind, but for day-to-day transport, the car is the go-to tool.
There is a corollary to this phenomenon in the Application Delivery world. While in some scenarios a software ADC will do the trick, there are benefits to hardware that mean if you have it, you’ll use the hardware much more frequently. This is true of far more than ADCs, but bear with me, I do work for an ADC vendor. There are some things that can just be done more efficiently in hardware, and some things that are best left (normally due to complexity) to software. In the case of FPGAs, low-level operations that do a lot of repetitive actions are relatively easy to implement – to the point that FPGAs and/or the programming tools for them now come with certain pre-built layouts. As such, network processing that is latency-sensitive and can be done with little high-level logic is well suited to FPGA processing. When a packet can be processed in a few microseconds in an FPGA, or in orders of magnitude longer by the time it passes through the hardware, a DMA transfer, and the firmware/network stack, and finally lands in software that can manipulate it, definitely go with the FPGA option if possible.
And that’s where a lot of the benefits of FPGAs in the enterprise are being seen. Of course, you don’t want to run your own FPGA shop and maintain your own installation program to reap the benefits. But vendors have sets of hardware that are largely the same and are produced en masse. It makes sense that they would make use of FPGAs, and they do. Get that packet off the wire, and if it meets certain criteria, turn it around and get it back on the wire with minor modifications.
But that’s not all. While it was a great step to be able to utilize FPGAs in this manner without paying the huge up-front fees of getting an ASIC designed and a production run completed, the use of FPGAs didn’t stop there – indeed, it is still growing and changing. The big area that has really grown the usage of ever-larger FPGAs is software assistance. Much like a BIOS provides discrete functionality that software can call to achieve a result, FPGAs can define functions with a register interface that are called directly from software – not as a whole solution, but as an incremental piece of the solution. This enables an increase in the utilization of FPGAs and, if the functions are chosen carefully, an improvement in the overall performance of the system the FPGAs are there to support. It is, essentially, offloading workload from software. When that offload is of computationally intensive operations, the result can be a huge performance improvement. Where a software solution might have a function call, hardware can just do register writes and reads, leaving the system resources less taxed. Of course, if the operation requires a lot of data storage memory, it still will, which is why I said “computationally intensive”.
The key thing is to ask your vendor (assuming they use FPGAs) what they’re doing with them, and what benefit you’ll see. It is true that the vast majority of vendors go to FPGAs for their own benefit, but that is not exclusive of making things better for customers. So ask them how you, as a customer, benefit.
And when you wonder why a VM can’t perform every bit as well as custom hardware, well, the answer is at least partially above. The hardware functionality of custom devices must be implemented in software for a VM, and that software then runs on not one but two operating systems, and eventually calls general-purpose hardware. While VMs, like feet, are definitely good for some uses, when you need your app to be the fastest it can possibly be, hardware – specifically FPGA-enhanced hardware – is the best answer, much as the car is the best answer for daily travel in most of the world. Each extra layer – generic hardware, the host operating system, the virtual network, and the guest operating system – adds cost to processing. The lack of an FPGA does too, because those low-level operations must be performed in software.
So know your needs, and use the right tool for the job. I would not drive a car to my neighbor’s house – 200 feet away – nor would I walk from Green Bay to Cincinnati (just over 500 miles). Know what your needs are and what your traffic is like, then ask about FPGA usage. And generalize this… to network switches, WAPs, you name it. You’re putting it into your network, so that IS your business.
And yeah, you’ll hear more on this topic before I wrap up the Bare Metal Blog series, but for now, keep doing what you do so well, and I’ll be back with more on testing soon.