
A look at Government Computer News’ Four-part series on Text Analytics

Government Computer News has published an in-depth examination of how text analytics is being used in the federal government. The series looks at how NASA is using text analytics for airline safety, how text analytics can “read between the lines” of terabytes of data, how text analytics can identify early signs of bio threats, and how agencies can use text analytics for data mining. The full four-part series can be found here, but we wanted to summarize and analyze it ourselves so we could give you our cut.

Bottom line: Great work by GCN. These guys are adding value to the dialog. Here are more thoughts:

NASA applies deep-diving text analytics to airline safety

NASA has created the Aviation Safety Program, which uses text analytics to process hundreds of thousands of unstructured reports. NASA collects everything from pilot reports to mechanics’ logs in an attempt to identify problems before they happen. This database was previously reviewed only by human analysts, who do not have the time or cycles to process all the data. The machine processing starts with natural language processing (NLP) and machine learning. For more, be sure to check out the full article here.
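The article doesn’t describe NASA’s actual pipeline, but a common first NLP step for a corpus like this is scoring terms so that a report’s distinctive language stands out for an analyst. A minimal sketch, with invented report text (nothing here reflects NASA’s real system):

```python
import math
from collections import Counter

# Hypothetical incident narratives, invented for illustration only.
reports = [
    "runway incursion during taxi low visibility",
    "hydraulic leak found during routine inspection",
    "runway incursion reported by tower during taxi",
    "engine vibration noted during climb",
]

def tfidf(docs):
    """Score each term in each document by TF-IDF."""
    tokenized = [d.split() for d in docs]
    # Document frequency: in how many reports does each term appear?
    df = Counter(t for doc in tokenized for t in set(doc))
    n = len(docs)
    scores = []
    for doc in tokenized:
        tf = Counter(doc)
        scores.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return scores

scores = tfidf(reports)
# Terms that appear in every report (e.g. "during") score zero, while
# report-specific terms like "hydraulic" rise to the top.
print(max(scores[1], key=scores[1].get))  # hydraulic
```

A real system would add tokenization, stemming, and a learned classifier on top, but the same “surface what’s distinctive” idea carries through.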

Text analytics: Reading between the lines of terabytes of data

DHS has started using text analytics to monitor social media networks for signs of terrorism. Scanning social media is nothing new, but machine-learning text analytics can surface “hidden relationships” that highlight trends and public sentiment. Further details are scant because of the pace at which adversaries adapt to governments’ techniques, tactics and procedures (TTPs). The article describes capabilities that leverage Apache Hadoop, though it never mentions Hadoop by name. For the full article, check it out here.
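“Hidden relationships” of this kind are often surfaced with simple co-occurrence analysis: terms that repeatedly show up together across many posts hint at a connection worth an analyst’s attention. The posts, terms, and threshold below are all invented for this sketch; nothing here reflects DHS’s actual tooling:

```python
from collections import Counter
from itertools import combinations

# Hypothetical term sets extracted from social media posts.
posts = [
    {"protest", "downtown", "friday"},
    {"protest", "downtown", "traffic"},
    {"protest", "downtown", "police"},
    {"weather", "friday"},
]

pair_counts = Counter()
for terms in posts:
    # Count every unordered pair of terms appearing in the same post.
    pair_counts.update(combinations(sorted(terms), 2))

# Pairs that co-occur repeatedly suggest a relationship to examine.
frequent = [pair for pair, count in pair_counts.items() if count >= 3]
print(frequent)  # [('downtown', 'protest')]
```

At terabyte scale this counting is exactly the kind of job Hadoop-style map-reduce was built for, which may be why the capabilities described line up with that stack.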

Canary in a data mine: How analytics detects early signs of bio threats

The National Collaborative for Bio-Preparedness (NCB-Prepared) is using a system “to monitor emergency medical services reports, poison center data and a wide array of other data sets, including social media, to detect signs of biological threats.” By analyzing these reports, the team identified a gastrointestinal outbreak two months before standard reporting would have caught it. The system uses SAS text analytics running on North Carolina State University’s cloud-based Virtual Computing Lab. To read more, check out the full report here.
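The article doesn’t explain NCB-Prepared’s detection method, but the core idea of catching an outbreak ahead of standard reporting can be sketched as a baseline-deviation check on daily report counts. The counts, window, and threshold below are all hypothetical; the real system uses SAS text analytics, not this toy detector:

```python
import statistics

# Hypothetical daily counts of gastrointestinal-symptom reports.
daily_counts = [4, 5, 3, 4, 6, 5, 4, 5, 3, 4, 12, 15, 18]

WINDOW = 7        # baseline: the preceding seven days
THRESHOLD = 3.0   # flag days more than 3 standard deviations above baseline

alerts = []
for day in range(WINDOW, len(daily_counts)):
    baseline = daily_counts[day - WINDOW:day]
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    # Flag a day whose count sits far above the recent baseline.
    if sd and (daily_counts[day] - mean) > THRESHOLD * sd:
        alerts.append(day)

print(alerts)  # the surge starting at day 10 is flagged as it begins
```

Real syndromic surveillance adds seasonality, geography, and free-text symptom extraction, but the “deviation from baseline” trigger is the same shape.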

Text analytics ready for the heavy lifting of agencies’ data mining

The last article revolves around the growing need for unstructured data analytics in the federal government. It features one of our heroes, Chris Biow, federal CTO at MarkLogic.

Biow agrees. “Any agency in the government that deals in any respect with the public should be using text analytics now,” he told GCN. “It’s maybe only being used now in 20 percent of the cases where it should. It’s as broad as treaty compliance versus watching public sentiment toward the United States overseas to predict a riot. All of that is out there.”

Biow said the most critical thing in initial implementations of text analytics is managing expectations, because machines are still not nearly as good at analyzing text as humans. “The machine’s advantage is that it can do all the text,” he explained. “[But] you don’t have enough human beings to read it all. The machines will make a pass-over and humans can then refine that. The machines are getting better in terms of the complexity and detail that they can extract, but not necessarily in terms of the quality. That’s why it’s important to set expectations.”
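The “machines make a pass, humans refine” workflow Biow describes can be sketched as confidence-based routing: the machine labels everything, and only low-confidence items go to a human analyst. The classifier and threshold here are stand-ins, not MarkLogic’s implementation:

```python
# Toy classifier: any model that returns (label, confidence) slots in here.
def classify(text):
    if "riot" in text or "protest" in text:
        return ("unrest", 0.9)
    if "treaty" in text:
        return ("compliance", 0.55)
    return ("other", 0.3)

CONFIDENCE_FLOOR = 0.8  # below this, route the document to a human analyst

documents = [
    "reports of a protest forming near the embassy",
    "ambiguous note referencing the treaty annex",
    "routine weather summary",
]

auto_labeled, needs_review = [], []
for doc in documents:
    label, confidence = classify(doc)
    # Machines handle the bulk; humans refine the uncertain remainder.
    if confidence >= CONFIDENCE_FLOOR:
        auto_labeled.append((doc, label))
    else:
        needs_review.append((doc, label))

print(len(auto_labeled), len(needs_review))  # 1 2
```

The threshold is the expectation-setting knob: lower it and the machine handles more but errs more; raise it and humans see more of the queue.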

“The best practice here,” Biow said, “is setting reasonable expectations. And results can definitely be improved as your users, library scientists and text analytics vendors start working together.”

Because many agencies do not discuss their text analytics programs in public, it is hard to get data on solutions and successes. Biow also noted that managing expectations can be hard, as machines still have much to learn. For the rest of the article, check out the report here.

There are many great points in this series; we highly recommend it. Thanks, GCN.

We hope to see follow-on work by GCN along these lines, perhaps diving into the new realm of Model Enabled Analysis and capabilities like Savanna from Thetus, which is showing a great path to helping humans interact with information like that described in this GCN series.

More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and publisher of CTOvision.com.
