
JISC report on “Value and benefits of text mining”

I came across this excellent report titled "Value and benefits of text mining". The report was commissioned by JISC, "a registered charity and champion of the use of digital technologies in UK education and research".

The report has “explored the costs, benefits, barriers and risks associated with text mining within UKFHE (UK further and higher education) research”.

The report rightly points out how text mining has already been explored by business "to analyse customer and competitor data to improve competitiveness". Two examples mentioned are: "the pharmaceutical industry mines patents and research articles to improve drug discovery; within academic research, mining and analytics of large datasets are delivering efficiencies and new knowledge in areas as diverse as biological science, particle physics and media and communications."

Speaking particularly about the global research community, it goes on to mention that it "generates over 1.5 million new scholarly articles per annum." That's really huge! If these scholarly articles are mined, there exists a huge "opportunity to support innovation".
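To make the idea concrete, here is a minimal, purely illustrative sketch of one basic text-mining step: counting the most frequent content terms across a set of article abstracts. The abstracts and stop-word list below are hypothetical placeholders, not data from the report.

```python
# A minimal sketch of one basic text-mining step: extracting the most
# frequent content terms from a set of article abstracts.
# The abstracts and stop-word list are hypothetical placeholders.
from collections import Counter
import re

abstracts = [
    "Text mining of patents can accelerate drug discovery pipelines.",
    "Mining scholarly articles reveals hidden links between genes and diseases.",
    "Large-scale analytics of research literature supports innovation.",
]

stop_words = {"of", "can", "the", "and", "between", "a", "to"}

def top_terms(docs, n=5):
    """Tokenize, drop stop words, and count the most common terms."""
    tokens = []
    for doc in docs:
        tokens.extend(t for t in re.findall(r"[a-z]+", doc.lower())
                      if t not in stop_words)
    return Counter(tokens).most_common(n)

print(top_terms(abstracts))
# e.g. [('mining', 2), ('text', 1), ...]
```

Real text-mining pipelines over millions of articles would of course add entity recognition, relationship extraction and scalable indexing on top of such counting, but the principle of turning unstructured text into analyzable features is the same.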

The report recognizes copyright law, particularly in the UK, as one of the issues that might hinder the use of text mining in UK research circles. It notes how the recommendations of the Hargreaves report, which "proposes an exception to support text mining and analytics for non-commercial research", could remedy this situation.

Here are some key findings of this report:

1) Text mining has tremendous scope in the areas of biomedical sciences, chemistry, social sciences and humanities. Given the current copyright laws, text mining can be carried out only on open documents, which are limited in number.

2) One has to incur high transaction costs to gain access to text-minable documents. This is attributed to "the need to negotiate a maze of licensing agreements covering the collections researchers wish to study."

3) It identifies both specific and broader societal benefits, such as "increased researcher efficiency; unlocking hidden information and developing new knowledge; exploring new horizons; improved research and evidence base; and improving the research process and quality. Broader economic and societal benefits include cost savings and productivity gains, innovative new service development, new business models and new medical treatments."

I found this report an interesting read; you can continue reading at http://www.jisc.ac.uk/reports/value-and-benefits-of-text-mining

Text Analytics 2014: Q&A with Fiona McNeill, SAS

I post a yearly look at the Text Analytics industry — technologies and market developments — from the provider perspective. This year’s is Text Analytics 2014.

To gather background material for the post, and for my forthcoming report "Text Analytics 2014: User Perspectives on Solutions and Providers" (which should be out by late May), I interviewed a number of industry figures: Lexalytics CEO Jeff Catlin, Clarabridge CEO Sid Banerjee, Fiona McNeill of SAS, Daedalus co-founder José Carlos González, and Tom Anderson of Anderson Analytics and OdinText. (The links behind the names will take you to the individual Q&A articles.) This article is the Q&A with Fiona McNeill of SAS.

Fiona McNeill, SAS

Fiona McNeill is Global Product Marketing Manager at SAS and co-author of The Heuristics in Analytics: A Practical Perspective of What Influences Our Analytical World. The following are her December 2013 Q&A responses:

1. How has the market for text technologies, and text-analytics-reliant solutions, changed in the past year? Any surprises?

Text analytics is now much more commonly recognized as mainstream analysis, seen to improve business decisions and insights and to help drive more efficient operations. Historically, those of us in this field spent time gaining mindshare that text should be analyzed (beyond analysis of sentiment, mind you) — and over the past year this has shifted to best-practice methods of describing the ROI from text analytics to upper management. This demonstrates common recognition within organizations that there is value in doing text analysis in the first place. And now the focus has shifted to how best to frame that value for senior stakeholders.

The ease of analyzing big text data (hundreds of millions or billions of documents) has also improved over the past year, including extensions of high-performance text mining (from SAS) to new distributed architectures, like Hadoop and Cloudera. Such big content technologies will continue to expand and we can expect functionality to extend to more interactive and visual text analytics capabilities over the coming year.

2. Do you have a 2013 user story, from a customer, that really illustrates what text analytics is all about?

We can speak to customer applications that illustrate what text analytics is all about, though unfortunately without mentioning names. One is a retail client that recognized text data as a rich source for addressing a wide range of initial business challenges — real-time digital marketing, bricks-and-mortar risk monitoring, automatic detection of issues and sentiment from customer inquiries, internal problem identification from on-line help forums, improving web purchases with more relevant content, improving predictive model scores for job candidate suitability, and more. This SAS customer understood that text data is everywhere, which means that analysis of text data will help them better answer whatever business question they have.

Another customer is a manufacturer who strategically understands the power of text analytics and how it improves collaboration, communication and productivity within an organization. As such, they wanted an extensible platform to address all types of text documents. They also had a wide range of written languages that they needed to integrate into existing search and discovery methods, in order to provide more accurate and more relevant information across their entire business. This SAS customer understood the innovation that can come when resources are freed from searching and empowered to find the answers they need when they need them, creating an organization with "The Power to Know."

We have a European customer announcement [that came] out in February, focused on leveraging WiFi browsing behavior and visitor profiles to create prescriptive advertising promotions in real time for in-store shoppers. This is big data, real-time, opportunistic marketing — driven by text insights and automated operational decision advertising execution. In other words, putting big text insights to work — before the data is out of date.

3. How have perceptions and requirements surrounding sentiment analysis evolved? Where are sentiment capabilities heading, in your view?

It is no longer necessary to explain why sentiment analysis is important; it's been largely accepted that customer, prospect and public perception of an organization is useful for understanding product and brand reputation. Historically, there was a focus on how well these models worked. It's gradually being understood that there are tradeoffs between precision and recall associated with sentiment scores, at least in some domains. Acceptance, it appears (as with any new modeling technique), has occurred within the bounds of applicability — adding previously unknown insight into the context of comments, reviews, social posts and the like. To that end, and when a generalized methodology is used, as is the case at SAS, the sentiment polarity algorithm is evolving to examine an even broader set of scenarios — employee satisfaction, author expertise, the mood of an individual, and so forth. Sentiment appears to be headed to the structured data analysis realm — becoming a calculated field that is used in other analysis, like predictions, forecasts, and interactive visual interrogation. And as such, identifying the ROI of sentiment analysis efforts is expected to become easier.
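As an editorial illustration (not part of the interview, and not SAS's algorithm), the idea of sentiment as a "calculated field" can be sketched with a toy lexicon-based scorer whose output simply becomes another numeric column alongside structured attributes. All names and sample records below are hypothetical.

```python
# A toy illustration of sentiment as a calculated field: score each comment
# with a simple lexicon, then attach the score as a structured column that a
# predictive model or dashboard could consume. Not a production method.
POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"poor", "hate", "terrible", "slow"}

def polarity(text):
    """Return a naive polarity score in [-1, 1] based on word counts."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Hypothetical customer records; "sentiment" becomes just another field.
records = [
    {"customer_id": 1, "comment": "Great service, love the new app"},
    {"customer_id": 2, "comment": "Terrible wait times and slow support"},
]
for r in records:
    r["sentiment"] = polarity(r["comment"])

print(records)
```

Once the score exists as a plain numeric field, it can be fed into forecasts, predictive models or visual exploration like any other structured variable, which is the shift toward "structured data analysis" described above.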

4. What new features or capabilities are top of your customers’ and prospects’ wish lists for 2014? And what new abilities or solutions can we expect to see from your company in the coming year?

At SAS, all software development is driven by our customer needs — and so the products you see coming from SAS are based on what customers told us they require to solve business challenges and take advantage of market opportunities. For text analytics, our customers continue to want more interactive text visualizations — to make it even easier to explore data, both to derive analysis questions and to understand the insights from text results. They want easier methods to develop and deploy text models. Our customers also want more automation to simplify the more arduous text-related tasks, like taxonomy development. They want to easily access the text, understand it and the sentiment expressed in it, extract facts and define semantic relationships — all in one easy-to-use environment. They don't want to learn a programming language, spend time and resources integrating different technologies, or use multiple software packages. We've responded to this with the introduction of SAS Contextual Analysis — which will, by mid-year 2014, expand to provide an extremely comprehensive, easy-to-use and highly visual environment for interactively examining and analyzing text data. It leverages the power of machine learning and combines it with end-user subject matter expertise.

We will also continue to extend technologies and methods for examining big text data — continuing to take advantage of multi-core processing and distributed memory architectures for addressing even the most complex operational challenges and decisions that our customers have. We have seen the power of analyzing big data with real-time, data-driven operations and will continue to extend platforms, analytic methods and deployment strategies for our customers. In October 2013, we announced our strategic partnership with SAP — to bring SAS in-memory analytics to the SAP HANA platform. You'll see our joint market solutions announced over the coming year.

5. Mobile’s growth is only accelerating, complicating the data picture, accompanied by a desire for faster, more accurate, and more useful, situational insights delivery. How are you keeping up?

With a single platform for all SAS capabilities, we have the ability to interchange a wide range of technologies, which can easily be brought together to solve even the most complex analytic business challenges, for mobile or other types of real-time insight delivery. SAS offers a number of real-time deployment options, including SAS Decision Manager (for deploying analytically sound operational rules), SAS Event Stream Processing Engine (for analytic processing within event streams), SAS Scoring Accelerator for Hadoop (as well as other big data stores — for real-time model deployment), and real-time environments for analyzing and reporting data that operate on mobile devices, such as SAS Visual Analytics. SAS also has native read/write engines and support for web services, and, as mentioned above, we have recently announced a strategic partnership with SAP for joint technology offerings bringing the power of analytics to the SAP HANA platform.

We are constantly extending such capabilities, recognizing that information processing is bigger, faster and more dependent on well-designed analytic insight (including that from text data) than ever before. This growing need will only continue.

6. Where does the greatest opportunity reside, for you as a solution provider? Internationalization? Algorithms, visualization, or other technical advances? In data integration and synthesis and expansion to new data sources? In providing the means for your customers to monetize data, or in monetizing data yourselves? In untapped business domains or in greater uptake in the domains you already serve?

Given our extensive portfolio of solutions, SAS continues to invest in technology advances that our customers tell us they want to address the growing complexities of their business. This includes ongoing advances in algorithms, deployment mechanisms, data access, processing routines and other technical considerations. We continue to expand our extensive native language support, with over 30 languages and dialects already available in our text analytics products. Additional languages will be added as customer needs dictate. And while we already offer solutions to virtually every industry, we continue to further develop these products to provide leading edge capabilities for big data, high-performance, real-time analytically-driven results for our customers. You’ll see us moving more and more of our capabilities to cloud architectures.  For SAS, another great opportunity is the production deployment of analytics to automate, streamline and advance the activities of our customers. You’ll continue to see announcements from us over the coming year.

7. Do you have anything to add, regarding the 2014 outlook for text analytics and your company?

At SAS, text data is recognized as being a rich source of insight that can improve data quality, accessibility and decision-making. As such, you’ll see text-based processing capabilities in products outside of the pure-play text analytics technologies. And because of the common infrastructure that has been designed by SAS — all of these capabilities are readily integrated, and can be used to address a specific business scenario.  We will continue to extend text-based processing and insights into traditional predictive analysis, forecasting and optimization – as well as new solutions that include text analysis methods, and updates to existing products, like SAS Visual Analytics and our upcoming release of a new in-memory product for Hadoop (release announcement pending).  From a foundational perspective, text-based processing continues to be extended throughout our platform, with pending linguistic rules augmenting business and predictive scoring in real-time data streams, with extensions to analytically derived metadata from text and more.  And given the nature and recognition of text and what it can bring to improved insights, you’ll also see our industry solutions continue to extend the use of text-based knowledge.

Massachusetts Invests In Big Data Innovation

State provides $3 million to launch Massachusetts Open Cloud Project, a university-industry partnership to build a new public cloud computing infrastructure for big data researchers, innovators.

The Commonwealth of Massachusetts is providing $3 million in funding to launch its Open Cloud project, a university-industry partnership to build a new public cloud computing infrastructure for big data innovation.

The investment will come from the Collaborative Research and Development Matching Grant Fund, created as part of the Economic Development Bill signed by Massachusetts governor Deval Patrick in August 2012. It will be matched by $16 million from industry partners and universities, Patrick said during last week’s big data event at Massachusetts Green High Performance Computing Center, which will house the hardware platform for the project.

“Massachusetts Open Cloud will be a virtual laboratory to big data researchers and innovators in industry, academia, and government across the Commonwealth,” Patrick said. “It will be a forum to experiment across our silos with solutions to big problems.”

The Open Cloud project is backed by a number of large companies that include Cisco, EMC, SGI, Red Hat, Juniper, Dell, and Intel, among others. They will provide engineering and operational talent, equipment, financial support, and business guidance. On the academia side, Boston University is leading the overall project. Harvard University, MIT, Northeastern University, and University of Massachusetts are also part of the mix.


According to a description on Boston University’s website, the Massachusetts Open Cloud is a “new, transformative model for public clouds.” It’s an open, customizable approach to the design and operation of cloud computing. Instead of universities and researchers working independently or with a handful of vendors, Open Cloud participants can share ideas quickly with each other to try new approaches and come up with best practices by using open technology, Dave Egts, chief technologist for Red Hat’s US Public Sector, said in an email.

“The Massachusetts Open Cloud lets new ideas and different workloads quickly blossom on a range of hardware and software in a transparent and flexible way. If this was implemented on any one particular vendor’s public cloud, this consortium’s innovation can only happen above the cloud layer provided to them,” Egts said. “By including the underlying cloud hardware and software in the mix, they have choice from top to bottom, allowing them to tailor solutions to address a much more diverse set of workloads and solve problems faster.”

The project is part of a larger Big Data Initiative, which Patrick announced in 2012 to accelerate Massachusetts’ leadership in big data.

The global big data market is expected to reach $48 billion by 2017, up from $11.6 billion in 2012, according to the 2014 Mass Big Data Report, also released last week. Hardware and services will continue to account for the greatest share of revenue, but the fastest-growing sector is expected to be in big data-enabled applications. The need for big data applications in healthcare, life sciences, and financial services is prompting local firms to hire talent, seeking to fill as many as 3,000 big data-related jobs in Massachusetts over the next 12 months, the report found.

The report recommended various steps to help Massachusetts reach its big data goals, including strengthening opportunities for data science education and training, increasing regional talent retention and industry recruiting, and expanding access to open and public data.


Business Intelligence And The Future Of Manufacturing

Today’s plant floors generate unimaginable volumes of data. That data can help a manufacturer increase throughput, understand where it is most exposed to risk and respond to customer demands in near-real time. Yet in the manufacturing sector, many businesses struggle to maintain pace with their data.

A 2011 report on business intelligence (BI) in manufacturing by analyst firm Ventana Research found that 59 percent of manufacturing businesses reported their data was only somewhat accurate, and that 47 percent of manufacturers regularly delivered monthly, quarterly, and yearly reports beyond 7 days after the period ended.

Data that is inaccurate and delayed can harm any business. Manufacturers certainly aren’t alone—their counterparts in retail, finance and healthcare, for example, all grapple with how to make sense of the massive amount of information generated by the data-driven nature of today’s businesses.

Manufacturers are battling narrow profit margins, intensifying competitive pressures, and buyers who have fewer dollars to spend.

Manufacturing companies are also faced with operations and external supply chain activities that have become extremely complex, making them much harder to streamline, track, and control. As a result, manufacturers need new ways to optimize productivity, improve customer service, expand market share, increase revenue, and minimize expenses.

The Value of Business Intelligence

The answers for the manufacturing industry lie within the information created by its daily processes.

That data is located across diverse systems and locations, produced in many different forms and available to inform business processes in various ways. Businesses in manufacturing collect and track information from multiple enterprise systems, supplies and costs, real-time external feeds, customer and partner communications, financial information systems and industry market fluctuations.

Unyielding data streams mean it's no longer up to IT and analytics professionals alone to manage business intelligence. BI and analytics must be pervasive in a manufacturing company, where business professionals with different responsibilities should be able to view, interpret and act on data during the course of business.

But the Ventana Research report also found manufacturing is overly reliant on outmoded models for crunching data. The firm reported 90 percent of manufacturing companies use spreadsheets regularly, and 48 percent use spreadsheets for BI and analytics. The regular use of spreadsheets amounted to a two-day lag in providing metrics and key performance indicators (KPIs) to decision makers in the business, according to the report.

Ventana Research concluded only 12 percent of all manufacturing-focused corporations attained the firm’s highest ranking for maturity in their use of analytics.

BI in the Cloud for Manufacturing

Manufacturing businesses often use Enterprise Resource Planning (ERP) tools to manage all aspects of their business. ERP can now be delivered via the cloud for flexible and scalable management.

Some cloud-based ERP tools, like the Plex Manufacturing Cloud from Plex Systems Inc., are modernizing how manufacturers handle and share their data. The Plex Manufacturing Cloud contains an embedded BI platform called IntelliPlex that allows its manufacturing customers to quickly and easily view and manipulate data to gain business insights.

IntelliPlex, rebranded from the WebFOCUS InfoAssist platform from business intelligence provider Information Builders, uses a set of web-based design tools that lets people drag and drop data elements into custom reports and dashboards. Because it is cloud-based manufacturing ERP software, companies can avoid a large capital expenditure for IT hardware, and upgrades are not required because new features and functions are available immediately.

For manufacturers, such a cloud-based ERP module with built-in BI can be embedded into a larger solution or rebranded as its own application.

This means even non-technical users in a manufacturing business can filter information, visualize it through interactive displays, and create reports, charts, scorecards, and dashboards to share with others – all without assistance from the IT department. Many manufacturers use the tool to create reports on manufacturing metrics, key performance indicators (KPIs), scorecard data and cash flow. As reports are created, they can be automatically e-mailed to other users or set up for scheduled distribution. In short, BI is seamlessly integrated into the business’s operation.

In addition, many manufacturers define their own KPIs to assist with traceability, which involves finding problems at the source and then taking action to remediate those problems.

For example, if a brake manufacturer learns about a bad shipment of brake pads, it can use IntelliPlex to trace the serial number and then track the associated lots through the manufacturing process to determine where the defect originated. A sales manager could follow the same approach to examine results by region or sector, and then drill down to individual invoices to determine which sales people are performing well.
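The traceability logic itself is simple to picture. Here is a hedged sketch using hypothetical in-memory records and field names (serial, lot_id, process steps), not the IntelliPlex product: given a defective shipment's serial number, find its lot and then every recorded process step that lot passed through.

```python
# A sketch of lot-level traceability over hypothetical records: map a
# defective serial number to its production lot, then list the process
# steps (and machines) that lot went through to find where the defect arose.
shipments = [
    {"serial": "BP-1001", "lot_id": "L42"},
    {"serial": "BP-1002", "lot_id": "L43"},
]
process_history = [
    {"lot_id": "L42", "step": "mixing", "machine": "MX-7", "timestamp": "2014-03-02T08:15"},
    {"lot_id": "L42", "step": "curing", "machine": "CU-2", "timestamp": "2014-03-02T11:40"},
    {"lot_id": "L43", "step": "mixing", "machine": "MX-7", "timestamp": "2014-03-03T09:05"},
]

def trace(serial):
    """Return every recorded process step for the lot behind a serial number."""
    lots = {s["serial"]: s["lot_id"] for s in shipments}
    lot = lots.get(serial)
    return [step for step in process_history if step["lot_id"] == lot]

print(trace("BP-1001"))  # both steps for lot L42, pointing at candidate machines
```

In a real ERP-backed BI tool the same join would run over production databases and be surfaced as a drill-down report rather than a script, but the underlying lookup is the same.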

Other reports assist with inventory management by keeping workers apprised of projected and actual quantities during manufacturing, assembly, and distribution processes.

Traceability is critical in manufacturing, an industry in which businesses must coordinate product recalls and accountability issues. BI reports help them to uncover problems, maximize profitability and extend best practices inside the business from department to department and externally, with partners and customers.

Cloud-based BI platforms can also help manufacturers analyze operational metrics.

A typical mid-sized manufacturer might have hundreds of machines working around the clock. Assembly lines generate information from networked sensors that can be used for equipment configuration, troubleshooting, quality control, and maintenance purposes. Each machine captures an immense volume of data at each stage of the manufacturing process. Manufacturers need to continually examine this data to circumvent problems and keep the operation running at peak capacity.
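As a simple, hypothetical illustration of that continuous monitoring (sensor values, window size and threshold below are invented for the example), a rolling average over a machine's readings can flag when a process is drifting out of range before it becomes a quality problem.

```python
# An illustrative sketch of continuous operational monitoring: compute a
# rolling mean over a machine's temperature readings and flag readings
# whose rolling mean drifts above a limit. All values are hypothetical.
from collections import deque

def rolling_alerts(readings, window=3, limit=75.0):
    """Yield (index, rolling_mean) whenever the rolling mean exceeds the limit."""
    buf = deque(maxlen=window)
    for i, value in enumerate(readings):
        buf.append(value)
        if len(buf) == window:
            mean = sum(buf) / window
            if mean > limit:
                yield i, round(mean, 1)

temperatures = [70.2, 71.0, 72.5, 74.8, 77.1, 78.4, 76.9]  # hypothetical sensor stream
for idx, mean in rolling_alerts(temperatures):
    print(f"reading {idx}: rolling mean {mean} above limit")
```

Scaled up across hundreds of machines and fed into dashboards, this kind of rolling metric is exactly the sort of operational signal a BI platform keeps in front of plant-floor staff.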

Manufacturing and Data Discovery

The kind of real-time, ground level insight offered by a cloud-based ERP platform with business intelligence capabilities helps manufacturers manage information in a way that benefits every user in the organization, from the shop floor to the CEO’s office.

Information is key to data discovery. Data discovery is a term commonly used to describe the process of collecting and consolidating information from various back-end systems and sources, then using interactive dashboards to manipulate and analyze it to uncover patterns, trends, relationships, and anomalies.

BI in the cloud and easy-to-use data discovery tools can erase barriers to business intelligence adoption in manufacturing. Among the most common are the absence of resources, lack of budget, and a lack of awareness of BI and analytics, according to Ventana Research.

But the continuous innovation of emerging technologies can speed the rate at which manufacturing companies become adopters of BI platforms. Through better management of their growing data assets, organizations can create an analytics-driven culture to improve business performance, foster innovation to transform processes, and share strategic insights.

Dresner Advisory Services publishes new Wisdom of Crowds Collaborative Business Intelligence market study

Users Demonstrate a Renewed Interest in Collaborative Business Intelligence Capabilities, According to this Report.

Dresner Advisory Services today launched its 3rd annual Wisdom of Crowds® Collaborative Business Intelligence Market Study. The 2014 edition of this report explores deployment trends and user sentiment for Collaborative Business Intelligence, a process where people and organizations work together to develop a common understanding that is shared and used to support better decision making across the organization.

The new report builds upon the 2012 and 2013 editions to include more year-over-year comparisons, which, in addition to several new analyses, provide a valuable tool for anyone evaluating or investing in collaborative-based business intelligence (BI) products and services.

"We are excited to release our latest Collaborative BI Market Study, which has uncovered several fascinating user trends," said Howard Dresner, founder and chief research officer at Dresner Advisory Services. "For example, this year there is a rebound in interest in Collaborative BI, placing it above such high-profile topics as big data and social media. In addition, we see a much closer alignment between the tools and capabilities that users want and the features and functions that vendors are incorporating into their offerings."

“In 2012, Dresner Advisory was the first analyst organization to explore the topic of Collaborative BI, and this year’s analysis proves that the Collaborative BI dynamic is indeed a trend worthy of increased examination and analysis,” continued Mr. Dresner.

The Wisdom of Crowds Collaborative Business Intelligence Market Study is based on primary research conducted in February and March 2014. A wide range of BI users and implementers from around the world contributed via an online survey to gauge their current usage as well as intended deployment strategies for Collaborative BI capabilities.

For Mobile BI, small retail sales organizations should be your target

This is according to a 2013 study published by Dresner Advisory Services, a research and advisory firm that has been offering compelling research in the area of Business Intelligence since 2007. They published their first Wisdom of Crowds BI Market Study in 2010 and have since published this much-awaited study every year.

Here is a roundup of their 2013 study titled "Wisdom of Crowds – Mobile Computing / Mobile Business Intelligence Market Study".

“In 2013 those stating that Mobile BI is “critical” or “very important” declined from 61% to 57%. This is believed to be normal, as organizations move from the planning stages to deployment. “
“From a geographical perspective, the importance of mobile BI is far greater in developing countries than in North America and EMEA. This is due to the leap-frog effect of cell phones in developing nations; those without traditional BI are investing in Mobile BI first.”

“From a vertical industry perspective, Retail & Wholesale assign Mobile BI the highest importance and have been amongst the most ambitious in its adoption, particularly in store and field management. Other industries that placed a relatively high importance on Mobile BI are Technology and Food, Beverage & Tobacco.”

“Small organizations quickly embrace Mobile BI in comparison to larger enterprises. Nearly twice as many small organizations ranked it as critical vis-a-vis larger ones.“

“Those users whose roles require them to be “nomadic” (outside of the office more than inside) tend to be amongst the most well prepared for Mobile BI, as mobile technology has historically been mandatory to do their jobs. Hence, executives and Sales & Marketing are the most prepared and Finance and IT are the least prepared.”

On vertical industries: "those that are most nomadic should also be the most culturally prepared. However, this is only partially true. Those that perceive themselves to be prepared must also be willing to change and innovate. Key industries such as Retail have been amongst the earliest adopters of Mobile BI. In contrast, Healthcare appears less well prepared."

Read the complete study here.

Source – Wisdom of Crowds – Mobile Computing / Mobile Business Intelligence Market Study 2013.

Category – Mobile BI

Four banking challenges that business intelligence solutions solve

The banking industry faces four major challenges: a vast range of customers and varied data related to their transactions, ongoing regulatory changes, increasing consolidation, and competition. To address these challenges, banks employ BI solutions that help them make better business decisions and better target performance goals.

Let's look at each of them:

Data explosion – Banks handle immense amounts of information; it is hard to keep track of important information and to understand which information matters. This information comes from varied sources in varied formats, and it is a challenging task to make sense of consumer needs, track trends, identify profitable areas and monitor consumer credit from this data.

Consolidation – A host of mergers and acquisitions has resulted in a shift in corporate goals and an increased focus on managing internal systems. This consolidation activity gives banks an opportunity to greatly reduce overhead costs by integrating processes. Banks must identify areas to increase efficiency, cut costs and reduce redundancies.

Regulation – Continued regulatory changes, like Basel II and the SOX Act, require banks to re-examine many of their operational processes. Banks must integrate the finance and risk segments to comply with these regulations, and they need information analysis and reporting capabilities to stay compliant and manage risk.

Competition – Increased competition is making banks look for ways to differentiate themselves by providing top-quality customer service that caters to individual needs. An expanding customer base results in increased diversity in customer preferences and behaviours, and increased customer diversity means growing consumer demands. Banks need to respond to these demands effectively to ensure they retain customers and gain new ones.

Each of the above challenges requires banks to be proactive in managing and utilizing corporate data to stay ahead of the competition.

A BI solution gives banks the capability to analyze vast amounts of information to make the best business decisions. It also allows them to tap into their huge databases and deliver easy-to-comprehend insight to improve business performance and maintain regulatory compliance.

A BI solution allows companies to easily integrate and cross-reference vast amounts of information from multiple sources, identify relationships among the information, and learn how different factors affect each other.

A BI solution allows multiple users to manipulate the data to glean the most from the information that affects their decision making.

A BI solution caters to many people in different locations and with varied skill levels who need to use this information — everyone from executives who need high-level, customized summary data with drill-down capabilities to power users who need to create and design custom reports.

To sum it up, a BI solution helps banks increase revenue while maintaining or reducing costs. Business intelligence software allows banking enterprises to analyze profit and loss, including product sales analysis, campaign management, market segment analysis, and risk analysis. Banks can grow revenue by maximizing customer value over the long term and improving customer acquisition and retention. At the same time, they can reduce costs by managing risk and preventing fraud, as well as by improving operational efficiency.

How the Cloud brought BI solutions down to earth and made them attractive for SMBs

BI solutions, when done properly, help SMBs compete with medium and large enterprises by offering a level playing field: Who are the best customers? Which are the most profitable products or services? Which are the most efficient locations? How much does it cost to launch a new product or enter a new territory? Which marketing activity is offering the highest or the lowest return? BI offers insight into the cost of acquiring a new customer and how those costs relate to customer gain or loss. In short, it takes hunches and hindsight out of decision making and offers logical, data-based answers.

[However, one recent study reported that the smaller the company, the less likely it is to use or plan to use BI solutions. While 33% of midsize businesses currently use and 28% plan to use BI solutions, among small businesses just 16% currently use and 16% plan to use BI solutions.]

So, what changed in the "big boys' crystal ball" that now makes it attractive for small and medium businesses? BI has become more affordable and easier to use: newer technologies such as open source, cloud, in-memory technology, Web 2.0 interfaces, and new visualization technology are making BI tools much friendlier to SMBs.

For the current discussion, let's look at why SMBs are increasingly turning to cloud and SaaS-based BI solutions to overcome their data mining problems.

Affordable and Easy to Use – A typical BI solution, even in the last few years, was priced way above what a small or medium business could afford; add to this the complication of deploying and managing it, and you could just forget it. Then came the cloud and SaaS-based BI solutions — BIRST, Indicee (sales-related BI), GoodData, Kognitio, PivotLink… — which "pulled the rug out from under" the BI majors. SMBs found these solutions easy to use and affordable: they could save on the cost of deployment and of managing servers and network connections, and many of the solutions were simple enough to deploy with or without a consultant. A typical cost per user of a Lite version could be as low as $25, with Professional and Enterprise versions in the range of $75-90. Tibco Spotfire and Tableau Software, for example, offer no-cost and low-cost tools that let users develop and share easy-to-understand data visualizations. BIRST, Adaptive Insights, and PivotLink are among a handful of on-demand BI systems that SMBs can subscribe to online.

No full-time IT professional needed to manage it – Any BI solution that requires a full-time resource would be unaffordable for a small or medium enterprise; the self-serve analytics that are part of most SaaS and cloud-based BI solutions are the most attractive option.

Mobile workforce – With a major portion, or even just a part, of an SMB organization being mobile, these businesses have little use for desktop- or laptop-bound BI solutions; they look for cloud-based mobile solutions that offer mobility at no extra cost.

The challenge for SMBs is acquiring BI software on slim technology budgets, then deploying and maintaining systems with limited IT support; cloud and SaaS-based BI solutions answer this call.

Let us hear about your experience: are you an SMB using a BI solution? Let us know how you are doing.

Your Big Data Is Worthless if You Don’t Bring It Into the Real World


In a generation, the relationship between the “tech genius” and society has been transformed: from shut-in to savior, from antisocial to society’s best hope. Many now seem convinced that the best way to make sense of our world is by sitting behind a screen analyzing the vast troves of information we call “big data.”

Just look at Google Flu Trends. When it was launched in 2008 many in Silicon Valley touted it as yet another sign that big data would soon make conventional analytics obsolete.

But they were wrong.

Not only did Google Flu Trends largely fail to provide an accurate picture of the spread of influenza, it will never live up to the dreams of the big-data evangelists. Because big data is nothing without “thick data,” the rich and contextualized information you gather only by getting up from the computer and venturing out into the real world. Computer nerds were once ridiculed for their social ineptitude and told to “get out more.” The truth is, if big data’s biggest believers actually want to understand the world they are helping to shape, they really need to do just that.

It Is Not About Fixing the Algorithm

The dream of Google Flu Trends was that by identifying the words people tend to search for during flu season, and then tracking when those same words peaked in real time, Google would be able to alert us to new flu pandemics much faster than the official CDC statistics, which generally lag by about two weeks.
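The basic idea can be sketched very simply (this is a toy illustration of the concept, not Google's actual model, and all numbers are hypothetical): fit a line from weekly search volume for flu-related terms to reported flu cases, then use the latest search volume to "nowcast" the current week before official statistics arrive.

```python
# A toy illustration of the nowcasting idea behind Google Flu Trends:
# a simple least-squares fit from weekly flu-related search volume to
# reported cases, used to estimate the current week. Hypothetical data.
search_volume = [120, 150, 210, 340, 500, 610]   # weekly query counts
reported_cases = [80, 95, 140, 230, 330, 400]    # official counts, reported later

n = len(search_volume)
mean_x = sum(search_volume) / n
mean_y = sum(reported_cases) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(search_volume, reported_cases))
         / sum((x - mean_x) ** 2 for x in search_volume))
intercept = mean_y - slope * mean_x

this_week_volume = 700
print(f"nowcast: {intercept + slope * this_week_volume:.0f} estimated cases")
```

The weakness the article goes on to describe lives precisely in this step: the correlation between search terms and illness can break down when the terms reflect the season rather than the sickness.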

For many, Google Flu Trends became the poster child for the power of big data. In their best-selling book Big Data: A Revolution That Will Transform How We Live, Work and Think, Viktor Mayer-Schönberger and Kenneth Cukier claimed that Google Flu Trends was "a more useful and timely indicator [of flu] than government statistics with their natural reporting lags." Why even bother checking the actual statistics of people getting sick, when we know what correlates to sickness? "Causality," they wrote, "won't be discarded, but it is being knocked off its pedestal as the primary fountain of meaning."

But, as an article in Science earlier this month made clear, Google Flu Trends has systematically overestimated the prevalence of flu every single week since August 2011.

And back in 2009, shortly after launch, it completely missed the swine flu pandemic. It turns out that many of the words people search for during flu season have nothing to do with flu, and everything to do with the time of year flu season usually falls: winter.

Now, it is easy to argue – as many have done – that the failure of Google Flu Trends simply speaks to the immaturity of big data. But that misses the point. Sure, tweaking the algorithms, and improving data collection techniques will likely make the next generation of big data tools more effective. But the real big data hubris is not that we have too much confidence in a set of algorithms and methods that aren’t quite there yet. Rather, the issue is the blind belief that sitting behind a computer screen crunching numbers will ever be enough to understand the full extent of the world around us.

Why Big Data Needs Thick Data

Big data is really just a big collection of what people in the humanities would call thin data. Thin data is the sort of data you get when you look at the traces of our actions and behaviors. We travel this much every day; we search for that on the Internet; we sleep this many hours; we have so many connections; we listen to this type of music, and so forth. It’s the data gathered by the cookies in your browser, the FitBit on your wrist, or the GPS in your phone. These properties of human behavior are undoubtedly important, but they are not the whole story.

To really understand people, we must also understand the aspects of our experience — what anthropologists refer to as thick data. Thick data captures not just facts but the context of facts. Eighty-six percent of households in America drink more than six quarts of milk per week, for example, but why do they drink milk? And what is it like? A piece of fabric with stars and stripes in three colors is thin data. An American Flag blowing proudly in the wind is thick data.

Rather than seeking to understand us simply based on what we do as in the case of big data, thick data seeks to understand us in terms of how we relate to the many different worlds we inhabit. Only by understanding our worlds can anyone really understand “the world” as a whole, which is precisely what companies like Google and Facebook say they want to do.

Knowing the World Through Ones and Zeroes

Consider for a moment, the grandiosity of some of the claims being made in Silicon Valley right now. Google’s mission statement is famously to ”organize the world’s information and make it universally accessible and useful.” Mark Zuckerberg recently told investors that, along with prioritizing increased connectivity across the globe and emphasizing a knowledge economy, Facebook was committed to a new vision called “understanding the world.” He described what this “understanding” would soon look like: “Every day, people post billions of pieces of content and connections into the graph [Facebook’s algorithmic search mechanism] and in doing this, they’re helping to build the clearest model of everything there is to know in the world.” Even smaller companies share in the pursuit of understanding. Last year, Jeremiah Robison, the VP of Software at Jawbone, explained that the goal with their Fitness Tracking device Jawbone UP was “to understand the science of behavior change.”

These goals are as big as the data that is supposed to achieve them. And it is no wonder that businesses yearn for a better understanding of society. After all, information about customer behavior and culture at large is not only essential to making sure you stay relevant as a company, it is also increasingly a currency that in the knowledge economy can be traded for clicks, views, advertising dollars or, simply, power. If in the process, businesses like Google and Facebook can contribute to growing our collective knowledge about ourselves, all the more power to them. The issue is that by claiming that computers will ever organize all our data, or provide us with a full understanding of the flu, or fitness, or social connections, or anything else for that matter, they radically reduce what data and understanding means.

If the big data evangelists of Silicon Valley really want to “understand the world” they need to capture both its (big) quantities and its (thick) qualities. Unfortunately, gathering the latter requires that instead of just ‘seeing the world through Google Glass’ (or in the case of Facebook, Virtual Reality) they leave the computers behind and experience the world first hand. There are two key reasons why.

To Understand People, You Need to Understand Their Context

Thin data is most useful when you have a high degree of familiarity with an area, and thus have the ability to fill in the gaps and imagine why people might have behaved or reacted like they did — when you can imagine and reconstruct the context within which the observed behavior makes sense. Without knowing the context, it is impossible to infer any kind of causality and understand why people do what they do.

This is why, in scientific experiments, researchers go to great lengths to control the context of the laboratory environment – to create an artificial place where all influences can be accounted for. But the real world is not a lab. The only way to make sure you understand the context of an unfamiliar world is to be physically present yourself to observe, internalize, and interpret everything that is going on.

Most of ‘the World’ Is Background Knowledge We Are Not Aware of

If big data excels at measuring actions, it fails at understanding people’s background knowledge of everyday things. How do I know how much toothpaste to use on my toothbrush, or when to merge into a traffic lane, or that a wink means “this is funny” and not “I have something stuck in my eye”? These are the internalized skills, automatic behaviors, and implicit understandings that govern most of what we do. It is a background of knowledge that is invisible to ourselves as well as those around us unless they are actively looking. Yet it has tremendous impact on why individuals behave as they do. It explains how things are relevant and meaningful to us.

The human and social sciences contain a large array of methods for capturing and making sense of people, their context, and their background knowledge, and they all have one thing in common: they require that the researchers immerse themselves in the messy reality of real life.

No single tool is likely to provide a silver bullet to human understanding. Despite the many wonderful innovations developed in Silicon Valley, there are limits to what we should expect from any digital technology. The real lesson of Google Flu Trends is that it simply isn’t enough to ask how ‘big’ the data is: we also need to ask how ‘thick’ it is.

Sometimes, it is just better to be there in real life. Sometimes, we have to leave the computer behind.

 

 

Smart thinking by airlines and airports

100% of airlines and 90% of airports are investing in business intelligence solutions to provide intelligent information across their operations. This is according to Smart Thinking, released by SITA today at CAPA's Airlines in Transition Summit in Dublin.

DUBLIN, Ireland – More than half of passengers would use their mobiles for flight status, baggage status and airport directions, and by 2016 the majority of airlines and airports will offer these services. In total, 100% of airlines and 90% of airports are investing in business intelligence solutions to provide the intelligent information across their operations which these, and other services, demand. This is according to Smart Thinking, released by SITA today at CAPA's Airlines in Transition Summit in Dublin.

SITA, the IT and communications provider to the air transport community, regularly conducts global research on airports, airlines and passengers. This provides the unique opportunity to look across the entire industry and identify alignment, misalignment, and potential for acceleration. SITA’s Smart Thinking is based on this global research and incorporates additional input from leading airlines and airports including British Airways, Saudia, Dublin Airport Authority, London City Airport and Heathrow.

According to SITA's paper, flight status updates are already a mainstream mobile service and will extend to the vast majority of airlines and airports by the end of 2016. By then, what today are niche services will also be well established. Bag status updates will be offered by 61% of airlines, and 79% of airports will provide status notifications, such as queue times through security and walking time to gate. More than three quarters will also provide navigation/way-finding at the airport via mobile apps.

Nigel Pickford, Director, Market Insight, SITA, said: “Our research has clearly shown that the move to smartphone apps and mobile services is well underway. But many of the services that airlines and airports are planning are heavily dependent on their ability to provide more meaningful data and insight – providing passengers and staff the right information at the right time. Efforts are being made across the industry to collaborate and SITA has established the Business Intelligence Maturity Index to benchmark the progress.”

Pickford continued: “We asked airlines and airports to measure themselves in four categories of business intelligence best practice for this index: Data Access and Management; Infrastructure; Data Presentation; and Governance. Our analysis shows that on average the industry is only halfway to achieving best-in-class and further progress is needed.”

There are ongoing efforts across the industry to establish data standards and ensure system compatibility. Pickford added: “Though the picture is not perfect now, change is coming. All airlines and 90% of airports are planning to make business intelligence investments in the coming three years. Both face the issue though that while passengers are very keen to access information about their journey, they are also sensitive about privacy. The smart use of non-intrusive passenger information however will provide benefits to airlines and passengers.”

SITA’s report describes how today the focus is on building the foundation for business intelligence but looking ahead the combination of business intelligence plus predictive analysis will help improve the passenger experience, while optimizing the use of infrastructure and space at airports. In the past, airlines and airports had no choice but to react when “irregular” events such as bad weather disrupted their finely-tuned schedules. Using business intelligence they will be more proactive by analyzing past events and combining live data feeds from multiple sources to predict future events and take preventative action before they occur. By making the transition from reactive to proactive to preventative there are significant benefits to be gained for passengers and the industry alike.