
DB2 Is 30 Years Old Next Month

All this was done before the Internet was invented and memory and disks were expensive commodities

Daryl Taft’s article in eWeek reminded me that next month, on June 6, IBM’s DB2 RDBMS product will celebrate its 30th anniversary. This has a personal significance for me. I was part of the DB2 planning team then, and on June 6, 1983, I was in Lyon, France, at the European user group meeting, ready to announce IBM’s new RDBMS on MVS called DB2. Interestingly, I had prepared two presentation decks: one for DB2, and the other on IBM’s database directions. The second was in hand in case the announcement could not clear IBM’s approval process in time. Luckily I was cleared to go ahead with the announcement of the new production-ready RDBMS product called DB2, running on the mainframe MVS platform. I still recall the excitement of doing that in front of 2,000 people in Lyon, the gastronomic capital of France. Later that evening, the attendees were taken by bus to the Beaujolais winery for dinner.

Why was this significant? IBM Research had worked on a prototype called System R, which was commercialized on the VM platform under the name SQL/DS. Even though it supported the relational model and SQL, it lacked the robustness expected of an industrial DBMS, such as scalability, performance, and reliability. In the meantime, Oracle had gotten started in 1977, and its first product, based on System R principles and SQL, was introduced in 1979 on DEC/VAX. There was a gap of four years during which IBM did not have a commercial RDBMS on its flagship MVS platform. The only DBMS on MVS was IMS, based on the hierarchical data model and the proprietary DL/1 language. One of the internal debates was over the positioning of the new RDBMS when IMS was so significant a revenue generator. I recall the “dual database strategy” presentation we used to give (which one to use when). One good thing about DB2 was that the bottom layer of the engine (buffering, locking, latching, backup-recovery, write-ahead log, etc.) drew many lessons from the user experience of IMS. Hence DB2 had stronger industrial-strength features than its research cousin SQL/DS, as well as Oracle.
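For readers unfamiliar with the write-ahead-log (WAL) technique mentioned above, here is a minimal illustrative sketch: a change is appended to a durable log *before* the buffered page is modified, so that after a crash the state can be rebuilt by replaying the log. This is a toy model of the general idea only, not DB2's actual implementation; the class and record format are invented for illustration.

```python
class MiniWAL:
    """Toy write-ahead log: log first, then modify pages."""

    def __init__(self):
        self.log = []    # stand-in for the durable log file on disk
        self.pages = {}  # stand-in for buffered (volatile) data pages

    def update(self, key, value):
        # The WAL rule: append the log record before touching the page.
        self.log.append(("SET", key, value))
        self.pages[key] = value

    def recover(self):
        # After a crash, rebuild page state by replaying the log in order.
        pages = {}
        for op, key, value in self.log:
            if op == "SET":
                pages[key] = value
        return pages


db = MiniWAL()
db.update("acct:1", 100)
db.update("acct:1", 250)
db.pages.clear()  # simulate losing the buffered pages in a crash
recovered = db.recover()
print(recovered)  # the committed state survives because the log did
```

A real engine also logs undo information and periodic checkpoints so the whole log need not be replayed, but the ordering rule — log before page — is the core of the technique.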

The next year, 1984, I went to IBM’s Austin Lab for two years to lay the foundation work for DB2 for the IBM PC (OS/2). Subsequently the development was shifted to the IBM Toronto lab. I personally headed a team doing the early work of porting DB2 to Unix in 1990-91.

All this was done before the Internet was invented and memory and disks were expensive commodities. Now the scene has changed a great deal and we see so many new types of database engines coming to market to address the needs of extreme scale and huge volumes of data. IBM continues to be a lead player in the data management and analytics business.

It feels good to be part of that history. Happy birthday, DB2.


More Stories By Jnan Dash

Jnan Dash is Senior Advisor at EZShield Inc., Advisor at ScaleDB and Board Member at Compassites Software Solutions. He has lived in Silicon Valley since 1979. Formerly he was the Chief Strategy Officer (Consulting) at Curl Inc., before which he spent ten years at Oracle Corporation and was the Group Vice President, Systems Architecture and Technology till 2002. He was responsible for setting Oracle's core database and application server product directions and interacted with customers worldwide in translating future needs to product plans. Before that he spent 16 years at IBM. He blogs at http://jnandash.ulitzer.com.

