Monthly Archives: May 2014

JISC report on “Value and benefits of text mining”

I came across this excellent report titled “Value and benefits of text mining”. The report was commissioned by JISC, a registered charity that champions the use of digital technologies in UK education and research.

The report has “explored the costs, benefits, barriers and risks associated with text mining within UKFHE (UK further and higher education) research”.

The report rightly points out how text mining has already been embraced by business “to analyse customer and competitor data to improve competitiveness”. Two examples mentioned are: “the pharmaceutical industry mines patents and research articles to improve drug discovery; within academic research, mining and analytics of large datasets are delivering efficiencies and new knowledge in areas as diverse as biological science, particle physics and media and communications.”

Speaking particularly about the global research community, it goes on to mention that it “generates over 1.5 million new scholarly articles per annum.” That’s really huge! If these scholarly articles are mined, there exists a huge “opportunity to support innovation”.
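
To get a feel for what mining article text can look like in its simplest form, here is a minimal Python sketch (my own illustration, not from the report) that counts content terms across a few invented abstracts. Real pipelines layer entity recognition, disambiguation and relation extraction on top of this.

```python
# Illustrative only: the simplest form of text mining, term frequencies
# across a small corpus. The abstracts below are invented placeholders.
from collections import Counter
import re

abstracts = [
    "Text mining of patents accelerates drug discovery pipelines.",
    "Large-scale mining of scholarly articles reveals research trends.",
    "Analytics over article corpora supports innovation in biology.",
]

STOPWORDS = {"of", "the", "in", "over", "a", "an", "and"}

def top_terms(docs, n=5):
    """Tokenize each document, drop stopwords, count term frequencies."""
    counts = Counter()
    for doc in docs:
        tokens = re.findall(r"[a-z]+", doc.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts.most_common(n)

print(top_terms(abstracts))
# e.g. [('mining', 2), ...] so 'mining' surfaces as a recurring theme
```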

The report recognizes copyright law, particularly in the UK, as one of the issues that might hinder the use of text mining in UK research circles. It notes how the recommendations of the Hargreaves report, which “proposes an exception to support text mining and analytics for non-commercial research”, could remedy this situation.

Here are some key findings of the report:

1) Text mining has tremendous scope in the areas of biomedical sciences, chemistry, social sciences and humanities. Given current copyright law, however, text mining can be carried out only on open documents, which are limited in number.

2) One has to incur high transaction costs to gain access to text-minable documents. This is attributed to “the need to negotiate a maze of licensing agreements covering the collections researchers wish to study.”

3) It identifies both specific and broader societal benefits, such as “increased researcher efficiency; unlocking hidden information and developing new knowledge; exploring new horizons; improved research and evidence base; and improving the research process and quality. Broader economic and societal benefits include cost savings and productivity gains, innovative new service development, new business models and new medical treatments.”

I found this report an interesting read; I encourage you to continue reading at http://www.jisc.ac.uk/reports/value-and-benefits-of-text-mining

Text Analytics 2014: Q&A with Fiona McNeill, SAS

I post a yearly look at the Text Analytics industry — technologies and market developments — from the provider perspective. This year’s is Text Analytics 2014.

To gather background material for the post, and for my forthcoming report “Text Analytics 2014: User Perspectives on Solutions and Providers” (which should be out by late May), I interviewed a number of industry figures: Lexalytics CEO Jeff Catlin, Clarabridge CEO Sid Banerjee, Fiona McNeill of SAS, Daedalus co-founder José Carlos González, and Tom Anderson of Anderson Analytics and OdinText. (The links behind the names will take you to the individual Q&A articles.) This article is –

Text Analytics 2014: Q&A with Fiona McNeill, SAS

Fiona McNeill is Global Product Marketing Manager at SAS and co-author of The Heuristics in Analytics: A Practical Perspective of What Influences Our Analytical World. The following are her December 2013 Q&A responses:

1. How has the market for text technologies, and text-analytics-reliant solutions, changed in the past year? Any surprises?

Text analytics is now much more commonly recognized as mainstream analysis, seen to improve business decisions and insights and to help drive more efficient operations. Historically, those of us in this field spent time gaining mindshare that text should be analyzed (beyond analysis of sentiment, mind you) — and over the past year this has shifted to best-practice methods of describing the ROI from text analytics to upper management. This demonstrates common recognition within organizations that there is value in doing text analysis in the first place. The focus has now shifted to how best to frame that value for senior stakeholders.

The ease of analyzing big text data (hundreds of millions or billions of documents) has also improved over the past year, including extensions of high-performance text mining (from SAS) to new distributed architectures, like Hadoop and Cloudera. Such big content technologies will continue to expand and we can expect functionality to extend to more interactive and visual text analytics capabilities over the coming year.

2. Do you have a 2013 user story, from a customer, that really illustrates what text analytics is all about?

We can speak to customer applications that illustrate what text analytics is all about, though unfortunately without mentioning names. One is a retail client that recognized text data as a rich source for addressing a wide range of initial business challenges — real-time digital marketing, bricks-and-mortar risk monitoring, automatic detection of issues and sentiment in customer inquiries, internal problem identification from online help forums, improving web purchases with more relevant content, improving predictive model scores for job candidate suitability, and more. This SAS customer understood that text data is everywhere, which means that analysis of text data will help them better answer whatever business question they have.

Another customer is a manufacturer who strategically understands the power of text analytics and how it improves collaboration, communication and productivity within an organization. As such, they wanted an extensible platform to address all types of text documents. They also had a wide range of written languages that they needed to integrate into existing search and discovery methods, in order to provide more accurate and more relevant information across their entire business. This SAS customer understood the innovation that can come when resources are freed from searching and empowered with finding the answers they need when they need them, creating an organization with “The Power to Know.”

We have a European customer announcement [that came] out in February, focused on leveraging WiFi browsing behavior and visitor profiles to create prescriptive advertising promotions in real time for in-store shoppers. This is big data, real-time, opportunistic marketing — driven by text insights and automated operational decisions that execute the advertising. In other words, putting big text insights to work — before the data is out of date.

3. How have perceptions and requirements surrounding sentiment analysis evolved? Where are sentiment capabilities heading, in your view?

It is no longer necessary to explain why sentiment analysis is important; it’s largely accepted that customer, prospect and public perception of an organization is useful for understanding product and brand reputation. Historically, there was a focus on how well these models worked. It’s gradually being understood that there are tradeoffs between precision and recall associated with sentiment scores, at least in some domains. Acceptance, it appears (as with any new modeling technique), has occurred within the bounds of applicability, adding previously unknown insight into the context of comments, reviews, social posts and the like. To that end, and when a generalized methodology is used, as is the case at SAS, the sentiment polarity algorithm is evolving to examine an even broader set of scenarios — employee satisfaction, author expertise, the mood of an individual, and so forth. Sentiment appears to be headed to the structured data analysis realm — becoming a calculated field that is used in other analyses, like predictions, forecasts, and interactive visual interrogation. As such, identifying the ROI of sentiment analysis efforts is expected to become easier.
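
[Editor’s note: to make “sentiment as a calculated field” concrete, here is a deliberately naive, hypothetical lexicon-based sketch, not SAS’s generalized methodology, showing polarity computed as a numeric column that downstream analyses can consume.]

```python
# Editor's toy sketch (not SAS's methodology): sentiment polarity as a
# calculated field, ready to join structured analyses as a feature.
import re

POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"poor", "hate", "terrible", "unhappy"}

def polarity(text: str) -> float:
    """Return a score in [-1, 1] from counts of matched sentiment words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

reviews = ["Great product, love it!", "Terrible support. Unhappy."]
rows = [{"text": r, "sentiment": polarity(r)} for r in reviews]
print(rows)  # the numeric 'sentiment' column can feed predictive models
```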

4. What new features or capabilities are top of your customers’ and prospects’ wish lists for 2014? And what new abilities or solutions can we expect to see from your company in the coming year?

At SAS, all software development is driven by our customer needs — so the products you see coming from SAS are based on what customers told us they require to solve business challenges and take advantage of market opportunities. For text analytics, our customers continue to want more interactive text visualizations — to make it even easier to explore data, both to derive analysis questions and to understand the insights from text results. They want easier methods to develop and deploy text models. Our customers also want more automation to simplify the more arduous text-related tasks, like taxonomy development. They want to easily access the text, understand it and the sentiment expressed in it, extract facts and define semantic relationships — all in one easy-to-use environment. They don’t want to learn a programming language, spend time and resources integrating different technologies, or use multiple software packages. We’ve responded to this with the introduction of SAS Contextual Analysis — which will, by mid-year 2014, expand to provide an extremely comprehensive, easy-to-use and highly visual environment for interactively examining and analyzing text data. It leverages the power of machine learning and combines it with end-user subject matter expertise.

We will also continue to extend technologies and methods for examining big text data — continuing to take advantage of multi-core processing and distributed memory architectures to address even the most complex operational challenges and decisions that our customers have. We have seen the power of analyzing big data with real-time data-driven operations and will continue to extend platforms, analytic methods and deployment strategies for our customers. In October 2013, we announced our strategic partnership with SAP — to bring SAS in-memory analytics to the SAP HANA platform. You’ll see our joint market solutions announced over the coming year.

5. Mobile’s growth is only accelerating, complicating the data picture, accompanied by a desire for faster, more accurate, and more useful, situational insights delivery. How are you keeping up?

With a single platform for all SAS capabilities, we have the ability to interchange a wide range of technologies, which can easily be brought together to solve even the most complex analytic business challenges, for mobile or other types of real-time insight delivery. SAS offers a number of real-time deployment options, including SAS Decision Manager (for deploying analytically sound operational rules), SAS Event Stream Processing Engine (for analytic processing within event streams), SAS Scoring Accelerator for Hadoop (as well as other big data stores – for real-time model deployment), and real-time environments for analyzing and reporting data that operate on mobile devices, such as SAS Visual Analytics. SAS also has native read/write engines and support for web services, and, as mentioned above, we have recently announced a strategic partnership with SAP for joint technology offerings that bring the power of analytics to the SAP HANA platform.

We are constantly extending such capabilities, recognizing that information processing is bigger, faster and more dependent on well-designed analytic insight (including that from text data) than ever before. This growing need will only continue.

6. Where does the greatest opportunity reside, for you as a solution provider? Internationalization? Algorithms, visualization, or other technical advances? In data integration and synthesis and expansion to new data sources? In providing the means for your customers to monetize data, or in monetizing data yourselves? In untapped business domains or in greater uptake in the domains you already serve?

Given our extensive portfolio of solutions, SAS continues to invest in the technology advances that our customers tell us they want in order to address the growing complexities of their business. This includes ongoing advances in algorithms, deployment mechanisms, data access, processing routines and other technical considerations. We continue to expand our extensive native-language support, with over 30 languages and dialects already available in our text analytics products. Additional languages will be added as customer needs dictate. And while we already offer solutions to virtually every industry, we continue to further develop these products to provide leading-edge capabilities for big data, high-performance, real-time, analytically driven results for our customers. You’ll see us moving more and more of our capabilities to cloud architectures. For SAS, another great opportunity is the production deployment of analytics to automate, streamline and advance the activities of our customers. You’ll continue to see announcements from us over the coming year.

7. Do you have anything to add, regarding the 2014 outlook for text analytics and your company?

At SAS, text data is recognized as a rich source of insight that can improve data quality, accessibility and decision-making. As such, you’ll see text-based processing capabilities in products outside of the pure-play text analytics technologies. And because of the common infrastructure designed by SAS, all of these capabilities are readily integrated and can be used to address a specific business scenario. We will continue to extend text-based processing and insights into traditional predictive analysis, forecasting and optimization – as well as into new solutions that include text analysis methods, and updates to existing products, like SAS Visual Analytics and our upcoming release of a new in-memory product for Hadoop (release announcement pending). From a foundational perspective, text-based processing continues to be extended throughout our platform, with pending linguistic rules augmenting business and predictive scoring in real-time data streams, with extensions to analytically derived metadata from text, and more. And given the nature and recognition of text and what it can bring to improved insights, you’ll also see our industry solutions continue to extend the use of text-based knowledge.

Massachusetts Invests In Big Data Innovation

State provides $3 million to launch the Massachusetts Open Cloud Project, a university-industry partnership to build a new public cloud computing infrastructure for big data researchers and innovators.

The Commonwealth of Massachusetts is providing $3 million in funding to launch its Open Cloud project, a university-industry partnership to build a new public cloud computing infrastructure for big data innovation.

The investment will come from the Collaborative Research and Development Matching Grant Fund, created as part of the Economic Development Bill signed by Massachusetts governor Deval Patrick in August 2012. It will be matched by $16 million from industry partners and universities, Patrick said during last week’s big data event at Massachusetts Green High Performance Computing Center, which will house the hardware platform for the project.

“Massachusetts Open Cloud will be a virtual laboratory to big data researchers and innovators in industry, academia, and government across the Commonwealth,” Patrick said. “It will be a forum to experiment across our silos with solutions to big problems.”

The Open Cloud project is backed by a number of large companies that include Cisco, EMC, SGI, Red Hat, Juniper, Dell, and Intel, among others. They will provide engineering and operational talent, equipment, financial support, and business guidance. On the academia side, Boston University is leading the overall project. Harvard University, MIT, Northeastern University, and University of Massachusetts are also part of the mix.


According to a description on Boston University’s website, the Massachusetts Open Cloud is a “new, transformative model for public clouds.” It’s an open, customizable approach to the design and operation of cloud computing. Instead of universities and researchers working independently or with a handful of vendors, Open Cloud participants can share ideas quickly with each other to try new approaches and come up with best practices by using open technology, Dave Egts, chief technologist for Red Hat’s US Public Sector, said in an email.

“The Massachusetts Open Cloud lets new ideas and different workloads quickly blossom on a range of hardware and software in a transparent and flexible way. If this was implemented on any one particular vendor’s public cloud, this consortium’s innovation can only happen above the cloud layer provided to them,” Egts said. “By including the underlying cloud hardware and software in the mix, they have choice from top to bottom, allowing them to tailor solutions to address a much more diverse set of workloads and solve problems faster.”

The project is part of a larger Big Data Initiative, which Patrick announced in 2012 to accelerate Massachusetts’ leadership in big data.

The global big data market is expected to reach $48 billion by 2017, up from $11.6 billion in 2012, according to the 2014 Mass Big Data Report, also released last week. Hardware and services will continue to account for the greatest share of revenue, but the fastest-growing sector is expected to be in big data-enabled applications. The need for big data applications in healthcare, life sciences, and financial services is prompting local firms to hire talent, seeking to fill as many as 3,000 big data-related jobs in Massachusetts over the next 12 months, the report found.
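
For context, those figures imply a compound annual growth rate of roughly 33%. A quick back-of-the-envelope check (ours, not the report’s):

```python
# Back-of-the-envelope check of the growth forecast cited above:
# $11.6B (2012) growing to $48B (2017) over five years.
start, end, years = 11.6, 48.0, 2017 - 2012
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 33% per year
```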

The report recommended various steps to help Massachusetts reach its big data goals, including strengthening opportunities for data science education and training, increasing regional talent retention and industry recruiting, and expanding access to open and public data.


Business Intelligence And The Future Of Manufacturing

Today’s plant floors generate enormous volumes of data. That data can help a manufacturer increase throughput, understand where it is most exposed to risk, and respond to customer demands in near-real time. Yet in the manufacturing sector, many businesses struggle to keep pace with their data.

A 2011 report on business intelligence (BI) in manufacturing by analyst firm Ventana Research found that 59 percent of manufacturing businesses reported their data was only somewhat accurate, and that 47 percent of manufacturers regularly delivered monthly, quarterly, and yearly reports more than seven days after the period ended.

Data that is inaccurate and delayed can harm any business. Manufacturers certainly aren’t alone—their counterparts in retail, finance and healthcare, for example, all grapple with how to make sense of the massive amount of information generated by the data-driven nature of today’s businesses.

Manufacturers are battling narrow profit margins, intensifying competitive pressures, and buyers who have fewer dollars to spend.

Manufacturing companies are also faced with operations and external supply chain activities that have become extremely complex, making them much harder to streamline, track, and control. As a result, manufacturers need new ways to optimize productivity, improve customer service, expand market share, increase revenue, and minimize expenses.

The Value of Business Intelligence

The answers for the manufacturing industry lie within the information created by its daily processes.

That data is located across diverse systems and locations, produced in many different forms and available to inform business processes in various ways. Businesses in manufacturing collect and track information from multiple enterprise systems, supplies and costs, real-time external feeds, customer and partner communications, financial information systems and industry market fluctuations.

Unyielding data streams mean it’s no longer up to IT and analytics professionals alone to manage business intelligence. BI and analytics must be pervasive in a manufacturing company, where business professionals with different responsibilities should be able to view, interpret and act on data during the course of business.

But the Ventana Research report also found manufacturing is overly reliant on outmoded models for crunching data. The firm reported 90 percent of manufacturing companies use spreadsheets regularly, and 48 percent use spreadsheets for BI and analytics. The regular use of spreadsheets amounted to a two-day lag in providing metrics and key performance indicators (KPIs) to decision makers in the business, according to the report.

Ventana Research concluded only 12 percent of all manufacturing-focused corporations attained the firm’s highest ranking for maturity in their use of analytics.

BI in the Cloud for Manufacturing

Manufacturing businesses often use Enterprise Resource Planning (ERP) tools to manage all aspects of their business. ERP can now be delivered via the cloud for flexible and scalable management.

Some cloud-based ERP tools, like the Plex Manufacturing Cloud from Plex Systems Inc., are modernizing how manufacturers handle and share their data. The Plex Manufacturing Cloud contains an embedded BI platform called IntelliPlex that allows its manufacturing customers to quickly and easily view and manipulate data to gain business insights.

IntelliPlex, rebranded from the WebFOCUS InfoAssist platform from business intelligence provider Information Builders, uses a set of web-based design tools that lets people drag and drop data elements into custom reports and dashboards. Because it is cloud-based manufacturing ERP software, companies can avoid a large capital expenditure for IT hardware, and upgrades are not required because new features and functions are available immediately.

For manufacturers, such a cloud-based ERP module with built-in BI can be embedded into a larger solution or rebranded as its own application.

This means even non-technical users in a manufacturing business can filter information, visualize it through interactive displays, and create reports, charts, scorecards, and dashboards to share with others – all without assistance from the IT department. Many manufacturers use the tool to create reports on manufacturing metrics, key performance indicators (KPIs), scorecard data and cash flow. As reports are created, they can be automatically e-mailed to other users or set up for scheduled distribution. In short, BI is seamlessly integrated into the business’s operation.
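
As a generic illustration of that report-and-distribute pattern, here is a hypothetical standard-library Python sketch (not IntelliPlex’s actual interface): assemble KPI values, render them to a file, and hand the message to a scheduler for delivery.

```python
# Hypothetical sketch of the report-and-distribute pattern described
# above. Standard-library Python, NOT IntelliPlex's actual interface.
import csv, io
from email.message import EmailMessage

kpis = [("throughput_units", 1420), ("scrap_rate_pct", 1.8)]  # invented

# Render the KPI snapshot to CSV in memory.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["kpi", "value"])
writer.writerows(kpis)

# Package the report as an email attachment.
msg = EmailMessage()
msg["Subject"] = "Weekly KPI report"
msg["From"] = "reports@example.com"   # placeholder addresses
msg["To"] = "ops-team@example.com"
msg.set_content("Attached: this week's KPI snapshot.")
msg.add_attachment(buf.getvalue().encode(), maintype="text",
                   subtype="csv", filename="kpis.csv")

# A scheduler (cron, Windows Task Scheduler, ...) would run the send:
# import smtplib
# with smtplib.SMTP("smtp.example.com") as s:
#     s.send_message(msg)
```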

In addition, many manufacturers define their own KPIs to assist with traceability, which involves finding problems at the source and then taking action to remediate those problems.

For example, if a brake manufacturer learns about a bad shipment of brake pads, it can use IntelliPlex to trace the serial number and then track the associated lots through the manufacturing process to determine where the defect originated. A sales manager could follow the same approach to examine results by region or sector, and then drill down to individual invoices to determine which sales people are performing well.
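
The underlying trace logic is simple to sketch (hypothetical data and step names; a real ERP query would span many joined tables):

```python
# Hypothetical traceability sketch with invented data: from a failed
# serial number back to its lot, the lot's process steps, and every
# sibling unit that shared the lot.
serial_to_lot = {"BRK-1001": "LOT-7", "BRK-1002": "LOT-7",
                 "BRK-2001": "LOT-9"}
lot_history = {
    "LOT-7": ["mix-pad-compound", "press-station-3", "cure-oven-B"],
    "LOT-9": ["mix-pad-compound", "press-station-1", "cure-oven-A"],
}

def trace(serial):
    """Return the lot, its process steps, and all units from that lot."""
    lot = serial_to_lot[serial]
    affected = [s for s, l in serial_to_lot.items() if l == lot]
    return lot, lot_history[lot], affected

lot, steps, affected = trace("BRK-1001")
print(lot, steps, affected)
# LOT-7, its three process steps, and units BRK-1001/BRK-1002:
# recall only LOT-7 units; inspect press-station-3 and cure-oven-B first.
```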

Other reports assist with inventory management by keeping workers apprised of projected and actual quantities during manufacturing, assembly, and distribution processes.

Traceability is critical in manufacturing, an industry in which businesses must coordinate product recalls and accountability issues. BI reports help them to uncover problems, maximize profitability and extend best practices inside the business from department to department and externally, with partners and customers.

Cloud-based BI platforms can also help manufacturers solve the challenge of analyzing operational metrics.

A typical mid-sized manufacturer might have hundreds of machines working around the clock. Assembly lines generate information from networked sensors that can be used for equipment configuration, troubleshooting, quality control, and maintenance purposes. Each machine captures an immense volume of data at each stage of the manufacturing process. Manufacturers need to continually examine this data to circumvent problems and keep the operation running at peak capacity.
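
A minimal sketch of that continuous examination, using a rolling baseline to flag readings that drift well outside recent history (our illustration; production systems use far richer models):

```python
# Hypothetical sketch of continuous sensor monitoring: flag readings
# more than 3 standard deviations from a rolling baseline.
from collections import deque
from statistics import mean, stdev

def monitor(readings, window=20, threshold=3.0):
    """Yield (index, value) for readings that look anomalous."""
    baseline = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        baseline.append(value)

# Simulated spindle temperatures with one fault spike at the end.
temps = [71.8 + 0.1 * (i % 5) for i in range(40)] + [92.5]
for i, v in monitor(temps):
    print(f"Anomaly at sample {i}: {v} degrees C")
```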

Manufacturing and Data Discovery

The kind of real-time, ground level insight offered by a cloud-based ERP platform with business intelligence capabilities helps manufacturers manage information in a way that benefits every user in the organization, from the shop floor to the CEO’s office.

Information is key to data discovery. Data discovery is a term commonly used to describe the process of collecting and consolidating information from various back-end systems and sources, then using interactive dashboards to manipulate and analyze it to uncover patterns, trends, relationships, and anomalies.
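
In code, that collect-consolidate-analyze loop can be as simple as joining two extracts and grouping them, as in this hypothetical pandas sketch with invented figures:

```python
# Hypothetical data-discovery sketch: consolidate two back-end extracts
# and group to surface an anomaly. All figures are invented.
import pandas as pd

orders = pd.DataFrame({"plant": ["A", "A", "B", "B"],
                       "units": [120, 130, 80, 15]})
defects = pd.DataFrame({"plant": ["A", "B"],
                        "defect_rate": [0.01, 0.07]})

summary = (orders.groupby("plant", as_index=False)["units"].sum()
                 .merge(defects, on="plant"))
print(summary)
# Plant B pairs low output with a high defect rate: a lead worth a
# drill-down in an interactive dashboard.
```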

BI in the cloud and easy-to-use data discovery tools can erase barriers to business intelligence adoption in manufacturing. Among the most common are the absence of resources, lack of budget, and a lack of awareness of BI and analytics, according to Ventana Research.

But the continuous innovation of emerging technologies can speed the rate at which manufacturing companies become adopters of BI platforms. Through better management of their growing data assets, organizations can create an analytics-driven culture to improve business performance, foster innovation to transform processes, and share strategic insights.

Dresner Advisory Services publishes new Wisdom of Crowds Collaborative Business Intelligence market study

Users Demonstrate a Renewed Interest in Collaborative Business Intelligence Capabilities, According to this Report.

Dresner Advisory Services today launched its 3rd annual Wisdom of Crowds® Collaborative Business Intelligence Market Study. The 2014 edition of this report explores deployment trends and user sentiment for Collaborative Business Intelligence, a process where people and organizations work together to develop a common understanding that is shared and used to support better decision making across the organization.

The new report builds upon the 2012 and 2013 editions to include more year-over-year comparisons, which, in addition to several new analyses, provide a valuable tool for anyone evaluating or investing in collaborative-based business intelligence (BI) products and services.

“We are excited to release our latest Collaborative BI Market Study, which has uncovered several fascinating user trends,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. “For example, this year there is a rebound in interest in Collaborative BI, placing it above such high-profile topics as big data and social media. In addition, we see a much closer alignment between the tools and capabilities that users want and the features and functions that vendors are incorporating into their offerings.”

“In 2012, Dresner Advisory was the first analyst organization to explore the topic of Collaborative BI, and this year’s analysis proves that the Collaborative BI dynamic is indeed a trend worthy of increased examination and analysis,” continued Mr. Dresner.

The Wisdom of Crowds Collaborative Business Intelligence Market Study is based on primary research conducted in February and March 2014. A wide range of BI users and implementers from around the world contributed via an online survey to gauge their current usage as well as intended deployment strategies for Collaborative BI capabilities.

For Mobile BI, small retail sales organizations should be your target

This is according to a 2013 study published by Dresner Advisory Services, a research and advisory firm that has been offering compelling research in the area of Business Intelligence since 2007. They published their first Wisdom of Crowds BI Market Study in 2010 and have published this much-awaited study yearly ever since.

Here is a roundup of their 2013 study titled “Wisdom of Crowds – Mobile Computing / Mobile Business Intelligence Market Study”.

“In 2013 those stating that Mobile BI is ‘critical’ or ‘very important’ declined from 61% to 57%. This is believed to be normal, as organizations move from the planning stages to deployment.”

“From a geographical perspective, the importance of mobile BI is far greater in developing countries than in North America and EMEA. This is due to the leap-frog effect of cell phones in developing nations; those without traditional BI are investing in Mobile BI first.”

“From a vertical industry perspective, Retail & Wholesale assign Mobile BI the highest importance and have been amongst the most ambitious in its adoption, particularly in store and field management. Other industries that placed a relatively high importance on Mobile BI are Technology and Food, Beverage & Tobacco.”

“Small organizations quickly embrace Mobile BI in comparison to larger enterprises. Nearly twice as many small organizations ranked it as critical vis-a-vis larger ones.”

“Those users whose roles require them to be “nomadic” (outside of the office more than inside) tend to be amongst the most well prepared for Mobile BI, as mobile technology has historically been mandatory to do their jobs. Hence, executives and Sales & Marketing are the most prepared and Finance and IT are the least prepared.”

On vertical industries, the study notes that “those that are most nomadic should also be the most culturally prepared. However, this is only partially true. Those that perceive themselves to be prepared must also be willing to change and innovate. Key industries such as Retail have been amongst the earliest adopters of Mobile BI. In contrast, Healthcare appears less well prepared.”

Read the complete study here.

Source – Wisdom of Crowds – Mobile Computing / Mobile Business Intelligence Market Study 2013.
