
Reflections on John Chambers’ UseR! 2014 Keynote Address

by Joseph Rickert

John Chambers opened UseR! 2014 by describing how the R language grew out of early efforts to give statisticians easier access to high quality statistical software. In 1976 computational statistics was a very active field, but most algorithms were compiled as Fortran subroutines. Building models with this software was not a trivial process. First you had to write a main Fortran program to implement the model and call the right subroutines, and then you had to write the job control language code to submit your job and get it executed. When John and his Bell Labs colleagues sat down on that May afternoon to work on what would become the first implementation of the S language, they were thinking about how they could make this process easier.

The top half of John’s famous diagram from that afternoon schematically indicates their intention to design a software interface so that one could call an arbitrary Fortran subroutine, ABC, by wrapping it in some simplified calling syntax: XABC(). The main idea was to bring the best computational facilities to the people doing the analysis. As John phrased it: “combine serious computational challenges with convenience”.

In the end, the designers of both S and its second incarnation, R, did much better than convenience. They built a tool to facilitate “flow”. When you are engaged in any mentally challenging work (including statistical analysis) at a high level of play, you want to be able to stay in the zone and not get knocked out by peripheral tasks that interrupt your thought processes. As engaging and meaningful as it is in its own right, writing code is not doing statistics. One of the big advantages of working with R is that you can do quite a bit of statistics with just a handful of functions and the simplest syntax. R is a tool that helps you keep moving forward. If you want to see something, then plot it. If the data is in the wrong format, then mutate it.

A second idea that flows from the idea of S as an interface is that S was not intended to be self-sufficient. John was explicit that S was designed as an interface to the “best algorithms”, not as a “from the ground up programming language”. The idea of being able to make use of external computational resources is still compelling. There will always be high-quality stuff that we will want to get at. Moreover, as John elaborated: “unlike 38 years ago there are many possible interfaces to languages, to other computing models and to (specialized) hardware”. The challenge is to interface to applications that are “too diverse for one solution to fit them all”, and to do this “without losing the R that works in ‘ordinary’ circumstances”.

John offered three examples of R projects that extend the reach of R to leverage other computing environments:

- Rcpp turns C++ into an R function by generating an interface to C++ with much less programming effort than .Call.
- RLLVM enables compiling R language code into specialized forms for efficiency and other purposes.
- H2O provides a compressed, efficient external version of a data frame for running statistical models on large data sets.

These examples, chosen to represent each of the three different kinds of interface targets that John called out, also represent projects of different scope and levels of integration. With a total of 226 reverse depends and reverse imports, Rcpp is already a great success. It is likely that ready access to C++ will form a permanent part of the R programmer’s mindset.
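To make the Rcpp point concrete, here is a minimal sketch of the workflow the package enables (the sumSquares function and its C++ body are invented here purely for illustration): cppFunction() compiles a fragment of C++ and binds it to an R function in one step, generating the .Call plumbing that would otherwise have to be written by hand.

    library(Rcpp)

    # Compile a small piece of C++ and expose it as an R function;
    # cppFunction() generates and compiles the .Call interface for us.
    cppFunction('
      double sumSquares(NumericVector x) {
        double total = 0;
        for (int i = 0; i < x.size(); ++i) total += x[i] * x[i];
        return total;
      }
    ')

    sumSquares(c(1, 2, 3))   # returns 14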
RLLVM is a much more radical and ambitious project that would allow R to be the window to entirely different computing models. As best I understand it, the central idea is to use the R environment as the system interface to “any number of new languages”, perhaps languages that have not yet been invented. RLLVM would “Use R syntax for commands to be interpreted in a different interpreter”. RLLVM seems to be a powerful idea and a direct generalization of the original XABC() idea.

The RH2O package is an example of providing R users with transparent access to data sets that are too large to fit into memory (a minimal sketch of what this looks like from the R prompt appears after the links at the end of this post). It is one of many efforts underway (including those from Revolution Analytics) to integrate Hadoop, Teradata, Spark and other specialized computing platforms within the R environment. Some of these specialized platforms may indeed be long-lived, but it is not likely that all of them will be. From the point of view of doing statistics, it is the R interface that is likely to survive and persist; platforms will come and go.

An implication of the willingness of R developers to embrace diversity is that R is likely to always be a work in progress. There will be loose ends, annoying inconsistencies and unimplemented possibilities. I suppose that there are people who will never be comfortable with this state of affairs. It is not unreasonable to prefer a system where there is one best way to do something and where, within the bounds of some pre-established design, there is near perfect consistency. However, the pursuit of uniformity and consistency seems to me to doom designers to be at least one step behind, because it means continually starting over to get things right.

So what does this say about the future of R? John closed his talk by stating that “the best future would be one of variety, not uniformity”. I take this to mean that, for the near future anyway, whatever the next big thing is, it is likely that someone will write an R package to talk to it.

Some links regarding S and R History:

- John Chambers’ useR! 2006 slides
- Trevor Hastie’s interview with John Chambers
- Ross Ihaka: R: Past and Future History
- New York Times article
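Returning to the H2O example promised above, here is a minimal, hedged sketch of what working with an out-of-memory data frame looks like, using the h2o package that serves as the R front end to H2O (the file name and column names are placeholders invented for illustration, not taken from the talk):

    library(h2o)

    # Start (or connect to) a local H2O instance; the data will live
    # there, not in R's memory.
    h2o.init()

    # Import a CSV directly into H2O; 'flights' is just a lightweight
    # reference in the R session. (File and columns are hypothetical.)
    flights <- h2o.importFile("flights_2008.csv")

    # Fit a model against the external frame; only summaries come back to R.
    fit <- h2o.glm(y = "delayed", x = c("distance", "carrier"),
                   training_frame = flights, family = "binomial")
    summary(fit)

The point is the one made above: the syntax stays ordinary R, while the heavy lifting happens in a specialized engine that the R session merely points at.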


More Stories By David Smith

David Smith is Vice President of Marketing and Community at Revolution Analytics. He has a long history with the R and statistics communities. After graduating with a degree in Statistics from the University of Adelaide, South Australia, he spent four years researching statistical methodology at Lancaster University in the United Kingdom, where he also developed a number of packages for the S-PLUS statistical modeling environment. He continued his association with S-PLUS at Insightful (now TIBCO Spotfire) overseeing the product management of S-PLUS and other statistical and data mining products.

David Smith is the co-author (with Bill Venables) of the popular tutorial manual, An Introduction to R, and one of the originating developers of the ESS: Emacs Speaks Statistics project. Today, he leads marketing for REvolution R, supports R communities worldwide, and is responsible for the Revolutions blog. Prior to joining Revolution Analytics, he served as vice president of product management at Zynchros, Inc. Follow him on Twitter at @RevoDavid.
