
Enterprise Mobility: Key to Your Business Strategy

Tony Storr, Architect and Leader of IBM Mobile at Scale, writing in Mobile Business Insights, discusses the factors a C-suite executive should focus on if enterprise mobility is to play a key role in their business strategy.

Tony Storr starts off by describing how enterprises, in their anxiety to get on the mobility bandwagon, encouraged various business units to follow their own roadmaps, and even though it felt empowering at the time, this led to serious issues over the years: varying standards, system fragmentation, security implications and provider volatility, all resulting in an uneven user experience. The real business problem came when it was time for digital transformation: these issues became key factors affecting the pace of change and posed serious challenges, especially where speed to market was crucial. This scenario is widespread, irrespective of industry and geography.

The challenge for enterprises is to consolidate and industrialize their mobile applications without encroaching upon the innovation brought in by the business units. This entails much more than having a single vendor or internal team churn out applications in a consistent manner. Enterprises embarking upon consolidation should not forget that consolidation needs to happen holistically and continuously across portfolios of apps and also across mobile app services (design, development, maintenance, support and monitoring).

The key characteristics of effective mobile services are continuous operation, growing assets and accelerators, increasing productivity, cross-app design and architecture, integrated support functions, and innovative tooling and techniques.

In conclusion, Tony Storr writes that consolidating and refactoring all existing applications may not be advisable; he suggests starting with the next generation of employee apps, from which enterprise mobility may well take off.

To know more, visit: http://mobilebusinessinsights.com/2017/03/get-serious-with-enterprise-mobility/

How Do Machines Learn?

Writing in Datanami, a news portal dedicated to Big Data news, insights and analysis, Fiona McNeill, a SAS global marketer, and Dr. Hui Li, a senior staff scientist at SAS, shed light on how machines learn. Starting with examples of various enterprises using machine learning to design personalized offerings to attract customers, they raise the important point that different vendors are jumping on the machine learning bandwagon with their own approaches and solutions, making the whole thing confounding to the user. Through this article in Datanami, the duo from SAS try to unravel machine learning and make it easier for users to understand how exactly it works.

Machine learning models are designed to learn how to perform tasks: with algorithms built to find relationships and patterns between various factors, these models learn continuously from data. To generalize a model for business use, it is then validated on a whole set of new data that was not used for training. Models may be made to learn in different ways: supervised, semi-supervised, unsupervised and reinforcement learning.
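
For readers who want to see the idea in code, here is a minimal sketch of that train-then-validate workflow in Python with scikit-learn. The synthetic dataset and the logistic regression model are illustrative assumptions, not anything prescribed in the Datanami article.

```python
# Minimal sketch: train a supervised model, then validate it on data held back from training.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for business data: 1,000 rows, 10 input factors, one binary outcome.
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)

# Hold back 30% of the rows; the model never sees them during training.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)              # the "learning" step: find patterns in the training data

predictions = model.predict(X_test)      # apply what was learned to data not used for training
print("Accuracy on held-out data:", accuracy_score(y_test, predictions))
```

This is the supervised case; semi-supervised, unsupervised and reinforcement learning follow the same learn-from-data principle but with less, or different, labelling of the outcomes.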

Machine learning is at the center of many of the advanced intelligent solutions emerging now, such as artificial intelligence, neural networks, natural language processing and cognitive computing.

  • Artificial Intelligence – A discipline enabling the design of machines with problem-solving skills that can accomplish tasks just as human beings can.
  • Neural Networks and Deep Learning – Neural networks are programs written to learn from observational data and present a solution to the problem at hand (see the minimal sketch after this list). They are used in speech and image recognition and are very successful in supervised learning.
  • Natural Language Processing and Cognitive Computing – NLP provides interfaces that enable machines to understand human language and humans to interpret machine output. It is applied in image captioning, text generation and machine translation.
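
As a companion to the neural networks bullet above, here is a minimal, hypothetical sketch of a small neural network learning from observational data, using scikit-learn's multi-layer perceptron on a toy dataset that stands in for real speech or image features.

```python
# Minimal sketch: a small neural network (one hidden layer) trained on labelled observations.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy non-linear dataset standing in for real observational data (e.g. image features).
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# One hidden layer of 16 units; deep learning stacks many more such layers.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print("Accuracy on unseen observations:", net.score(X_test, y_test))
```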

The confluence of Big Data and massively parallel computational environments is driving machine learning initiatives, and the goal is to deliver solutions that are highly customizable and have human-like cognition features.

For more on this, please visit: https://www.datanami.com/2017/01/31/intelligent-machines-learn-make-sense-world/

2016: A Big Year for Big Data

Writing recently in insideBigData, Linda Gimmeson, a technical writer focusing on Big Data, machine learning and IoT, looks back at 2016 and describes how Big Data contributed both technologically and socially. She then sets out a list of areas that have benefited from the application of Big Data.

AI Advancement – Big Data is advancing the speed and capacity of artificial intelligence and taking it to the next level. For example, Google DeepMind's AI beat humans at the game of Go, growing harder to beat as play progressed thanks to the AI and the Big Data applied to its functioning.

Tax Shelters Unveiled – Investigative journalists collaborating across continents and using cloud-based data analytics and Big Data were able to effectively pursue and unveil the tax shelters now famously known as the "Panama Papers". This is one of the first known instances of the real-world good Big Data can help bring about.

Human Trafficking – Big Data is lending a helping hand to the Polaris Project in its fight against human trafficking. Although the Polaris Project has made tremendous progress over the years, Big Data became its strongest tool in 2016, deciphering complex numbers and patterns to give useful insights and help victims of this horrific crime.

Cancer Research – Intel's Trusted Analytics Platform, championed by Bryce Olson, himself a cancer survivor, is "a collection of Big Data tools and data analytics to help in breaking down DNA – the complex code of human genetics – to give insights into where cancer begins and how it can be controlled".

HIV Outbreaks – When the Centers for Disease Control and Prevention were struggling to contain an HIV outbreak in 2016, one taking an extensive toll in deaths and seriously damaging the health of survivors, they turned to Big Data for insights to fight the outbreak.

In 2016 we witnessed that Big Data, in addition to its applications in optimizing business efficiency and effectiveness, can also be deployed for real-world good, and we are sure to see more such deployments in the coming years.

For more on this, please read: http://insidebigdata.com/2017/02/26/2016-big-year-big-data/

6 Ways Business Intelligence is going to change in 2017

Ralph Tkatchuk, a freelance security consultant writing in Dataconomy, a leading portal for news, events and opinion on data-driven technology, talks about six ways business intelligence is going to change in 2017.

Noting that decisions based on data are more effective than those based on intuition, perception and assumptions, he goes on to say that data-driven businesses are five times more likely to make faster decisions than their peers, and twice as likely to land in the top quartile of financial performance within their industry.

Until very recently, only large enterprises with access to sophisticated business intelligence tools and the ability to collect vast amounts of data were beneficiaries of this data-driven strategy. Businesses also had to invest in analytical solutions and data scientists to convert that data into useful information, leaving a large chunk of small and medium businesses outside its sphere of influence. But this is going to change in 2017.

In their bid to be on par with large enterprises in taking advantage of data-driven technologies, SMBs are turning to self-sufficient business intelligence tools. Intuitive interfaces and astute data preparation tools, arriving at very low price points, allow SMBs to be their own data scientists.

In light of this development, Ralph Tkatchuk expects business intelligence to change in 2017 in the following ways:

  • Affordable Access – Complex data analytics is becoming more cost-effective and hence accessible to SMBs.
  • Smart Integration – BI is becoming available through more integrations, such as messaging services and IoT, and is moving to an on-demand service model that helps SMBs.
  • Simplified Analytics – With comprehensive solutions for back-end number crunching and front-end visualization, BI is being commoditized to within the reach of SMBs.
  • Cloud-Based Data – The adoption of cloud-based data warehousing is contributing to SMBs' BI self-sufficiency.
  • Evolved Visualization – The new self-sufficient BI tools offer interactive, real-time data visualization, spawning dashboards that users can drill down into.
  • Collaboration – With BI becoming accessible, SMBs can employ cross-team collaboration to increase effectiveness and efficiency.

All these advances will see SMBs adopting BI extensively in the coming days.

To read more on this subject, visit: http://dataconomy.com/2017/02/6-ways-business-intelligence-changes

Top 10 Artificial Intelligence Technologies

Gil Press, a regular contributor to Forbes on technology, entrepreneurs and innovation, writes about how the market for artificial intelligence technologies is flourishing and lists the top ten AI technologies we will see making a splash in the market.

Quoting figures from various market research agencies, Gil Press starts off by showing how the artificial intelligence market is growing and will continue to grow in the coming years. For instance, he cites a Narrative Science survey finding that 38% of enterprises are already using AI in some form, a share expected to reach 62% by 2018, and an IDC estimate that the AI market will grow from $8 billion in 2016 to $47 billion by 2020.

Artificial intelligence, a term coined in 1955 to describe a discipline of computer science, today encompasses various technologies, some of which have stood the test of time and some of which are new.

Based on Forrester's TechRadar report on artificial intelligence, Gil Press lists his top ten artificial intelligence technologies:

  1. Natural Language Generation
  2. Speech Recognition
  3. Virtual Agents
  4. Machine Learning Platforms
  5. AI-Optimized Hardware
  6. Decision Management
  7. Deep Learning Platforms
  8. Biometrics
  9. Robotic Process Automation
  10. Text Analytics & NLP

As with every technology, AI faces a few obstacles to adoption alongside some big benefits. According to a survey Forrester conducted last year, the obstacles voiced were: no clear use case (42%), not clear what it can be used for (39%), lack of the required skills (33%), the need to first invest in data management systems (29%), no budget (23%), and AI systems not yet being proven (14%).

Forrester concludes that once enterprises overcome these obstacles, they will be ready to use AI and to gain from it in customer-facing applications and in developing an interconnected web of enterprise intelligence.

For more, please visit:  http://www.forbes.com/sites/gilpress/2017/01/23/top-10-hot-artificial-intelligence-ai-technologies/2/#252d8c64179e

On GPUs (Graphics Processing Units)

We are excited to share an article by Eric Mizell in RTInsights.com, a web magazine dedicated to advances in real-time analysis, IoT and Big Data. Writing about how recent advances in input/output devices have pushed the performance bottleneck to processing, he describes how the Graphics Processing Unit (GPU), with its ability to perform parallel deep analyses in real time, is expected to help usher in a new era of cognitive computing.

According to Eric Mizell, steady advances in CPU, storage, memory and networking technologies laid the foundation for economical cognitive computing. Price and performance in data analytics were pushed further by the arrival of solid-state storage and plentiful Random Access Memory (RAM). These advances shifted the performance bottleneck from input/output devices to processing and put the stress on higher processing rates, but they also left cognitive computing and other mature analytical applications the preserve of a few very large organizations with multi-core CPUs deployed across several clusters of servers.

As an answer to this need for affordable processing power came the Graphics Processing Unit (GPU), whose capability for parallel processing lets it process data up to 100 times faster than CPU-only configurations. GPUs were initially designed for graphics and installed on a separate card with their own memory (video RAM), a configuration popular with gamers looking for real-time graphics. As the processing power and programmability of GPUs increased over time, they came to be used in additional applications.

We know the real benefit of cognitive computing is derived when it is real-time, and this can be achieved economically with GPU acceleration. With its variety of analytical processes – artificial intelligence, business intelligence, machine learning, natural language processing – cognitive computing is well suited to GPU acceleration. Cognitive computing workloads, with their repeated and similar instructions, are ideal for parallel processing by the thousands of small, efficient cores of a GPU.
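
As a rough, hypothetical illustration of why those thousands of small cores matter, the sketch below times the same matrix multiplication on the CPU (NumPy) and on a GPU (CuPy). It assumes a CUDA-capable GPU and the CuPy library are available; the actual speed-up will vary with hardware and problem size.

```python
# Minimal sketch: the same dense matrix multiplication on CPU (NumPy) vs GPU (CuPy).
import time

import numpy as np
import cupy as cp  # assumption: CuPy is installed and a CUDA-capable GPU is present

N = 4000
a_cpu = np.random.rand(N, N).astype(np.float32)
b_cpu = np.random.rand(N, N).astype(np.float32)

t0 = time.perf_counter()
np.matmul(a_cpu, b_cpu)
print("CPU time: %.2f s" % (time.perf_counter() - t0))

a_gpu = cp.asarray(a_cpu)            # copy the data into GPU memory (video RAM)
b_gpu = cp.asarray(b_cpu)
t0 = time.perf_counter()
cp.matmul(a_gpu, b_gpu)
cp.cuda.Stream.null.synchronize()    # GPU kernels run asynchronously; wait for completion
print("GPU time: %.2f s" % (time.perf_counter() - t0))
```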

Presently, GPU acceleration is being deployed by Amazon and Nimbix, and Google is readying to equip its cloud platform with GPUs for Google Compute Engine and Google Cloud Machine Learning services.

For more, please visit https://www.rtinsights.com/gpus-the-key-to-cognitive-computing/

Text Mining for Claims Management in Insurance

Writing in the Claims Journal, a magazine dedicated to the claims industry, Judith Vaughan and Michael W. Elliott, experts in claims and risk management, describe how text analytics is helping the claims industry with claim processing, fraud detection and overall efficiency in the claims process.

The authors quote a report from Accenture which compares the revolution Big Data has wrought on business processes to the way the internet revolutionized lives in the 1990s. They quote the same report to say that 83% of enterprises envisage using Big Data to gain business advantage.

They write next about how the insurance industry, especially its claims departments, is at the forefront of developing and deploying data analytics to increase process efficiency, detect fraud and sort claims. The major challenge in deploying data analytics is its prohibitive cost, since only about 15% of the data can readily be converted into actionable information. The other 85%, available in unstructured form, is difficult to decipher and to extract useful business information from, and this percentage jumps even higher in claims departments, where witness reports, claimant statements and other notes are verbose and unstructured in nature.

The insurance industry and data engineers are looking to text mining, or text analytics, to extract valuable data from this 85% of unstructured data. At its most basic, text mining scans for keywords or phrases and looks for relationships within data; with advances in natural language processing (NLP) and decision logic, it now has the capability for sentiment analysis, through which enterprises can discover customers' opinions about a particular product. Applied to the insurance industry, a text mining algorithm can scan a claimant's social media content in real time to validate the information provided in the claim.
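
To make that "basic level" of text mining concrete, here is a minimal sketch in Python that scans unstructured claim notes for watch-list phrases. The notes and the phrase list are entirely made up for illustration; a production system would lean on NLP libraries rather than plain string matching.

```python
# Minimal sketch: keyword scanning over unstructured claim notes (illustrative data only).
claim_notes = {
    "CLM-001": "Vehicle was rear-ended at a stop light; claimant reports whiplash.",
    "CLM-002": "Fire damage to kitchen; claimant recently increased coverage and has prior losses.",
    "CLM-003": "Laptop stolen from parked car; no police report filed yet.",
}

# Hand-picked phrases an adjuster might want flagged for closer review.
watch_phrases = ["prior losses", "recently increased coverage", "no police report"]

for claim_id, note in claim_notes.items():
    hits = [phrase for phrase in watch_phrases if phrase in note.lower()]
    if hits:
        print(f"{claim_id}: flag for review -> {hits}")
    else:
        print(f"{claim_id}: no watch-list phrases found")
```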

Text mining also helps streamline claims processing, thereby increasing process efficiency, and it can be useful in developing new products and overcoming lacunae in present offerings.

Another major application of text mining in the insurance industry is identifying potential fraud. Since the majority of frauds are known to happen during claims, text mining may, for example, red-flag an instance where several claimants use exactly the same words or sentences in their claims, a sure sign of suspicious activity (a minimal similarity check of this kind is sketched below).
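
Here is a minimal sketch of that red-flagging idea, assuming scikit-learn is available: it vectorizes a few made-up claim statements with TF-IDF and flags any pair whose cosine similarity indicates near-identical wording.

```python
# Minimal sketch: flag pairs of claims with nearly identical wording (illustrative data only).
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

claims = [
    "The pipe burst overnight and flooded the basement, ruining the carpet and furniture.",
    "I slipped on the wet floor near the entrance and injured my lower back.",
    "The pipe burst overnight and flooded the basement, ruining the carpet and furniture.",
]

tfidf_matrix = TfidfVectorizer().fit_transform(claims)
similarity = cosine_similarity(tfidf_matrix)

THRESHOLD = 0.9  # near-identical wording
for i, j in combinations(range(len(claims)), 2):
    if similarity[i, j] >= THRESHOLD:
        print(f"Claims {i} and {j} share nearly identical wording "
              f"(similarity {similarity[i, j]:.2f})")
```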

In these and many other ways, insurers can harness text mining and other data analysis tools to increase process efficiency and bring innovation to their product offerings and market reach.

For more on this, visit: http://www.claimsjournal.com/news/national/2016/12/05/275316.htm

Technology Trends for 2017

Kasey Panetta, Brand Content Manager at Gartner, gives us a first glimpse of the technology landscape for 2017. Writing in a blog post, Panetta expands on the technology trends expounded by David Cearley, VP and Gartner Fellow, at Gartner Symposium/ITxpo 2016 in Orlando, Florida.

Writing on the technology trends for 2017, Panetta starts off with the themes underpinning these trends, namely intelligent, digital and mesh, and then classifies the trends by their underlying themes.

Going by the trends, every technology-enabled service, application and thing will in the coming days be fortified with intelligence. AI and machine learning technologies have reached a stage in their evolution where creating systems that can learn, adapt and act is readily achievable, and most technology vendors will try to do this in the coming years. Riding on this, technologies like deep learning and natural language processing will spin off advanced systems. Intelligent apps such as virtual personal assistants (VPAs) and enterprise applications are expected to embrace intelligence and alter both the personal and professional lives of individuals. Along with AI and intelligent apps, existing things, including the IoT, will be enriched by the power of AI and are expected to transform the existing ecosystem.

According to Panetta, going into 2017 the line between the physical and digital worlds will continue to blur, and the desire to make the digital world an accurate reflection of the physical one will spin off quite a few business models. Riding this digital revolution, technologies like virtual reality and augmented reality will create immersive environments. A newer technology, the digital twin, will gain momentum over the next three to five years and will initially be used to simulate and analyze real-world conditions. Also trending on developments in digital technology is blockchain, a kind of distributed ledger which holds the promise of changing existing business models, especially in the music industry.

Defining the final theme, mesh, as the dynamic connection of people, processes and services supporting intelligent digital systems, Panetta suggests that because this involves changes in user experience, a fundamental change in the supporting technology, architecture and platforms will be called for. The mesh is expected to push forward new technologies such as conversational systems, mesh app and service architecture, digital technology platforms and adaptive security architecture.

For more visit: http://www.gartner.com/smarterwithgartner/gartners-top-10-technology-trends-2017/

Tags: Gartner, IoT, AI

The All Pervasive Cloud

Joe McKendrick, a contributor to the Forbes tech blog, writing in his latest post, quotes Cisco estimates to show how cloud computing is becoming an all-pervasive technology. He draws significant figures from the Cisco forecast to establish this: cloud traffic of 3.9 zettabytes (ZB) in 2015 is expected to reach 14.1 ZB by 2020, and 92% of workloads will likely be processed by cloud data centers within a few years, with only the remaining 8% processed by traditional data centers.

According to this estimate, Big Data and the associated Internet of Things (IoT) will be the prime drivers of cloud computing growth. By 2020, database, analytics and IoT workloads will amount to 22% of the total, up from 20% in 2015. The volume of data generated by IoT is estimated to reach around 600 ZB by 2020, roughly 39 times the total projected data center traffic of 15.3 ZB.

Talking about the composition of the cloud, Cisco estimates that by 2020, 68% of cloud workloads will be in public clouds, up from 49% in 2015, while 32% will be in private clouds, down from 51% in 2015. As the Cisco team explains, this shift from private to public cloud reflects organizations' deployment of hybrid cloud strategies, in which the private cloud addresses daily computing needs and the public cloud is used to absorb sudden spurts in traffic demand.

While Cisco estimates that Software as a Service (SaaS), for online applications, and Platform as a Service (PaaS), for development tools, databases and middleware, will grow, Infrastructure as a Service (IaaS), for online servers and storage, is expected to see a drop in demand; this too, according to Cisco analysts, is attributable to organizations' growing desire to deploy a hybrid cloud strategy.

Coming to consumer usage of the cloud, by 2020 cloud storage traffic per user will be 1.7 GB per month, compared with 513 MB per month in 2015, driven mainly by video and social networking workloads.

For more visit : http://www.forbes.com/sites/joemckendrick/2016/11/13/with-internet-of-things-and-big-data-92-of-everything-we-do-will-be-in-the-cloud/#5fd230a3593f

Tags: IoT, Cloud computing, Cisco, Big data

The Public Cloud – A Two Horse Race

Amazon Web Services (AWS), the public cloud pioneer, with its first-mover advantage sits safely on top of the heap with 31% of the market (as of Q4 2015, the latest quarter for which numbers are publicly available). But going by the way Microsoft's Azure is notching up numbers, the public cloud will be a two-horse race between AWS and Azure within a year or two.

AWS, it seems, is not sitting smug in its lead; in its bid to entice new customers and hold on to existing ones, it has continually added newer services to its existing array of infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) features. A recent example is the AWS Application Discovery Service, which helps users and system integrators plan application migrations. Interestingly, AWS has also dropped its prices to hold on to its lead, raising fears of long-term financial detriment.

Microsoft Azure, in its quest to leverage Microsoft's large base of on-premise technologies, is focusing on providing strong management and a consistent user experience across platforms. This plays neatly into enterprises' interest in minimizing vendors and systems, creating a loyal base of customers who, when fully converted, may tip the race in Microsoft's favor.

Another interesting facet of this race is the competitors' contrasting approaches to cloud computing: while AWS relies on delivering value-added services to its small and medium customers with an emphasis on automation and scalability, Microsoft Azure is focusing on integration services between on-premise technologies and its public cloud to leverage its large enterprise customers.

These two leaders, who have left the other players to fight over the scraps, still have challenges to overcome to safeguard their positions. AWS has to overcome the perception that its services are complex and its pricing complicated. Microsoft Azure has to effectively counter the dearth of Azure cloud experts and the perception that many of its features and functionalities are not ready for very large customers.

For more on this, read "The AWS vs. Azure race isn't over yet": http://fortune.com/2016/08/04/amazon-microsoft-cloud-race/

Tags: Amazon Web Services, Microsoft Azure, AWS, Cloud Computing