
Talend Increases Big Data Integration Performance and Scalability by 45 Percent

Hadoop Summit -- Talend, the global big data integration software leader, today announced the availability of Talend version 5.5, the latest release of the only integration platform optimized to deliver the highest performance on all leading Hadoop distributions.

Talend 5.5 improves performance and scalability on Hadoop by an average of 45 percent. Adoption of Hadoop is skyrocketing, and companies large and small are struggling to find enough knowledgeable Hadoop developers to meet the growing demand. Only Talend 5.5 allows any data integration developer to use a visual development environment to generate native, high-performance, highly scalable Hadoop code. This unlocks a large pool of development resources that can now contribute to big data projects. In addition, Talend is staying on the cutting edge of new developments in Hadoop that allow big data analytics projects to power real-time customer interactions.
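The release does not show what this generated code looks like. As a hedged illustration only, here is a minimal, hand-written sketch of a native Hadoop MapReduce job of the kind such a visual designer produces on the developer's behalf; the word-count-style aggregation and all class names are assumptions for the example, not Talend output.

```java
// Minimal sketch of a native Hadoop MapReduce job, the kind of boilerplate
// a visual designer can generate automatically. Illustrative only.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class TokenCountJob {

  // Map phase: emit (token, 1) for every whitespace-separated token.
  public static class TokenMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text token = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      for (String t : value.toString().split("\\s+")) {
        if (t.isEmpty()) continue;
        token.set(t);
        context.write(token, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each token.
  public static class SumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) sum += v.get();
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "token count");
    job.setJarByClass(TokenCountJob.class);
    job.setMapperClass(TokenMapper.class);
    job.setCombinerClass(SumReducer.class);  // combiner reduces shuffle volume
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The mapper, reducer, combiner and job wiring above are exactly the boilerplate a visual development environment can emit, which is what lets an integration developer produce native Hadoop jobs without learning the MapReduce API first.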

Proven Performance and Scalability

It’s Talend’s mission to provide easy-to-use big data integration tools with the industry’s highest-performing, most scalable integration code running natively on Hadoop. As part of this mission, Talend puts every product release through a rigorous set of performance and scalability tests, including TPC-H, a performance benchmark developed by the Transaction Processing Performance Council. Across the 22 standard TPC-H queries, Talend’s generated MapReduce code ran up to 67 percent faster, with an average improvement of 45 percent.

TPC-H testing was just the beginning. Talend also worked with one of the industry’s largest financial services companies to do real-world testing. “At Talend, we are making the fastest and the most predictable integration solutions on the market,” said Fabrice Bonan, chief product officer and co-founder of Talend. “Our ability to show true scalability and performance on a 1000-node Hadoop cluster, with such a large customer, is just one proof point.”

Talend has made numerous other performance improvements throughout the product suite. One example is the Talend Data Mapper, an advanced data mapping tool designed explicitly to handle complex data structures such as XML, EDI and Java objects. This kind of data mapping is particularly important for managing electronic data interchange (EDI) in the healthcare industry or for conveying financial information between banks using the Financial products Markup Language (FpML). Talend 5.5 adds support for very large files, with the ability to stream multi-gigabyte documents into Hadoop clusters.
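The release does not describe how that streaming works internally. As a rough sketch of the general pattern only, and not of the Talend Data Mapper itself, moving a multi-gigabyte document into a Hadoop cluster without exhausting memory comes down to copying it into HDFS in fixed-size chunks through the standard Hadoop FileSystem API; the paths below are hypothetical:

```java
// Hedged sketch: stream a multi-gigabyte local XML/EDI document into HDFS
// chunk by chunk, keeping memory use constant regardless of file size.
// Illustrates the general pattern only, not the Talend Data Mapper.
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class StreamToHdfs {
  public static void main(String[] args) throws Exception {
    String localFile = args[0];   // e.g. a multi-gigabyte claims.xml (hypothetical)
    String hdfsTarget = args[1];  // e.g. hdfs://namenode:8020/staging/claims.xml

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(hdfsTarget), conf);

    try (InputStream in = new BufferedInputStream(new FileInputStream(localFile));
         FSDataOutputStream out = fs.create(new Path(hdfsTarget))) {
      // Copy in 64 KB buffers; the document is never held in memory whole.
      IOUtils.copyBytes(in, out, 64 * 1024, false);
    }
  }
}
```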

Igniting Real-Time Big Data

Talend 5.5 is also setting the bar for real-time or operational big data. Analytics is often just the first step in a company’s big data journey. The next step is delivering those analytics and recommendations to the right people at the right time. This is where operational big data comes in with its ability to handle thousands of simultaneous transactions in real time.

Working in conjunction with Talend alliance partner Altic, Talend Labs is supporting the future of Hadoop through Apache Spark for fast, large-scale data processing. Spark is ideally suited to high-volume, high-speed workloads such as fraud detection and sensor data processing. “Our support of Apache Spark is a great example of how Talend is helping the open source community and customers take advantage of the latest innovations in big data,” said Bonan. “Instead of becoming experts in every new Hadoop project, customers can use our visual designer and Talend generates optimized code for them. This allows IT organizations to stay focused on delivering business value while we keep them on the cutting edge.”
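The release does not show the generated Spark code. As an assumption-laden sketch of the kind of fast, large-scale processing described here, the following is a minimal Spark job written against the Java API that flags oversized transactions, a simplified stand-in for fraud detection; the CSV layout and the 10,000 threshold are invented for the example:

```java
// Minimal Apache Spark sketch (Java API): flag transactions above a
// threshold across a large dataset. The input layout (id,amount per line)
// and the threshold are assumptions for illustration only.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SuspectTransactions {
  public static void main(String[] args) {
    // The master URL is supplied by spark-submit at launch time.
    SparkConf conf = new SparkConf().setAppName("suspect-transactions");
    JavaSparkContext sc = new JavaSparkContext(conf);

    // Each input line is assumed to be: transactionId,amount
    JavaRDD<String> lines = sc.textFile(args[0]);

    // Keep only transactions above the illustrative threshold.
    JavaRDD<String> suspects =
        lines.filter(line -> Double.parseDouble(line.split(",")[1]) > 10_000.0);

    System.out.println("Suspect transactions: " + suspects.count());
    suspects.saveAsTextFile(args[1]);

    sc.stop();
  }
}
```

Submitted through spark-submit, a job like this distributes the filter across the cluster's memory, which is what makes Spark suited to the high-volume, high-speed cases the paragraph names.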

“Talend is answering the call from customers to support operational big data, providing users with the ability to process any data volume in real-time, and helping to drive and improve business performance,” said Charly Clairmont, chief technical officer of Altic. “We are proud to have been early supporters of Apache Spark for over two years. Now alongside the experts from Talend Labs, we are able to jointly offer Spark support, enabling the future of big data. Innovation in the big data space is clearly driven by open source vendors and Talend has always shown a strong commitment to driving value for its users.”

Availability

Version 5.5 of all Talend open source products is available for immediate download from Talend’s website, www.talend.com. Experimental support for Spark code generation is also available immediately and can be downloaded from the Talend Exchange on Talendforge.org. Version 5.5 of the commercial subscription products will be available within three weeks and will be provided to all existing Talend customers as part of their subscription agreement. Products can also be procured through the usual Talend representatives and partners.

To learn more about Talend 5.5 and its 45 percent faster big data integration performance and scalability, register here for our June 10 webinar.

About Talend

Talend’s integration solutions allow data-driven organizations to gain instant value from all their data. Through native support of modern big data platforms, Talend takes the complexity out of integration efforts and equips IT departments to be more responsive to the demands of the business, at a predictable cost. Based on open source technologies, Talend’s scalable, future-proof solutions address all existing and emerging integration requirements.

More than 4,000 enterprise customers worldwide rely on Talend’s solutions and services. Privately held and dual-headquartered in Los Altos, CA and Suresnes, France, the company has offices in North America, Europe and Asia, along with a global network of technical and services partners. For more information, please visit www.talend.com and follow us on Twitter: @Talend.

