
Top 10 Artificial Intelligence Technologies

Gil Press, a regular Forbes contributor on technology, entrepreneurs, and innovation, writes about how the market for artificial intelligence technologies is flourishing and lists the top ten artificial intelligence technologies we will see making a splash in the market.

Quoting figures from various market research agencies, Gil Press starts off by showing how the artificial intelligence market is growing and will continue to grow in the coming years. For instance, he cites a Narrative Science survey which found that 38% of all enterprises are already using AI in some form, a figure expected to reach 62% by 2018, and an IDC estimate that the AI market will grow from $8 billion in 2016 to $47 billion by 2022.

Artificial Intelligence, a term coined in 1955 to describe one of the disciplines of computer science, today encompasses various technologies, some of which have stood the test of time and some of which are new.

Based on Forrester's TechRadar report on artificial intelligence, Gil Press lists his top ten artificial intelligence technologies:

  1. Natural Language Generation
  2. Speech Recognition
  3. Virtual Agents
  4. Machine Learning Platforms
  5. AI-Optimized Hardware
  6. Decision Management
  7. Deep Learning Platforms
  8. Biometrics
  9. Robotic Process Automation
  10. Text Analysis & NLP

As with every technology, AI faces a few obstacles to adoption alongside some big benefits. According to a survey Forrester conducted last year, some of the obstacles voiced were: no clear use case (42%), not clear what AI can be used for (39%), lack of the required skills (33%), need to first invest in data management systems (29%), no budget (23%), and AI systems are not proven (14%).

Forrester concludes that once enterprises overcome these obstacles, they will be ready to gain from using AI in customer-facing applications and from developing an interconnected web of enterprise intelligence.

For more, please visit:  http://www.forbes.com/sites/gilpress/2017/01/23/top-10-hot-artificial-intelligence-ai-technologies/2/#252d8c64179e

On GPUs (Graphics Processing Units)

We are excited to share an article by Eric Mizell on RTInsights.com, a web magazine dedicated to advances in real-time analytics, IoT, and Big Data. Writing about how recent advances in input/output devices have pushed the performance bottleneck to processing, he explains how the Graphics Processing Unit (GPU), with its ability to perform deep analyses in parallel and in real time, is expected to help usher in a new era in cognitive computing.

According to Eric Mizell, steady advances in CPU, storage, memory, and networking technologies laid the foundation for economical cognitive computing. The arrival of solid-state storage and abundant Random Access Memory (RAM) pushed the price and performance of data analytics further still. These advances shifted the performance bottleneck from input/output devices to processing, putting the stress on raw processing power; but meeting that demand with multi-core CPUs deployed across several clusters of servers makes cognitive computing and other mature analytical applications the preserve of a few very large organizations.

The answer to this need for affordable processing power came in the form of the Graphics Processing Unit (GPU), whose capability for parallel processing lets it process data up to 100 times faster than configurations relying on CPU cores alone. GPUs were initially designed for graphics and installed on a separate card with their own memory (video RAM), a configuration popular with gamers looking for real-time graphics. As the processing power and programmability of GPUs increased over time, they came to be used in additional applications.
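To make the parallelism concrete, here is a minimal sketch, not taken from the article, that runs the same element-wise computation once on the CPU with NumPy and once on the GPU with the CuPy library. The array size, the arithmetic, and the choice of CuPy are our own illustrative assumptions; the actual speedup depends entirely on the hardware and the workload, so the 100x figure above should be read as indicative rather than guaranteed.

```python
# Minimal sketch: the same data-parallel workload on a CPU (NumPy) and a GPU (CuPy).
# Assumes a CUDA-capable GPU with CuPy installed; numbers below are illustrative.
import time

import numpy as np
import cupy as cp

N = 10_000_000
x_cpu = np.random.random(N).astype(np.float32)
x_gpu = cp.asarray(x_cpu)          # copy the data into the GPU's own memory (video RAM)

# CPU version: a handful of cores walk through the array.
t0 = time.perf_counter()
y_cpu = np.sqrt(x_cpu) * 2.0 + 1.0
cpu_time = time.perf_counter() - t0

# GPU version: thousands of small cores apply the same instructions in parallel.
t0 = time.perf_counter()
y_gpu = cp.sqrt(x_gpu) * 2.0 + 1.0
cp.cuda.Stream.null.synchronize()  # wait for the GPU to finish before stopping the clock
gpu_time = time.perf_counter() - t0

print(f"CPU: {cpu_time:.4f}s   GPU: {gpu_time:.4f}s")
```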

We do know that the real benefit of cognitive computing is derived when it operates in real time, and this can be achieved economically with GPU acceleration. Encompassing a variety of analytical processes such as artificial intelligence, business intelligence, machine learning, and natural language processing, cognitive computing is well suited to GPU acceleration: its workloads, with their repeated, similar instructions, map naturally onto parallel processing by the thousands of small, efficient cores of a GPU.
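As a small illustration of why such workloads fit GPUs so well, the sketch below, again our own rather than the article's, runs a tiny neural-network-style layer (a matrix multiplication followed by an activation) over a whole batch of inputs with CuPy. Every input row goes through exactly the same instructions, which is the kind of repeated, similar work that the GPU's many small cores execute in parallel.

```python
# Sketch of a cognitive-computing-style workload: identical operations applied
# to every row of a batch, which is what GPU cores execute side by side.
# Assumes CuPy and a CUDA-capable GPU; all shapes and values are illustrative.
import cupy as cp

batch_size, n_features, n_hidden = 4096, 512, 256

# Random "inputs" and "weights" standing in for real data and a trained model.
inputs = cp.random.random((batch_size, n_features)).astype(cp.float32)
weights = cp.random.random((n_features, n_hidden)).astype(cp.float32)
bias = cp.zeros(n_hidden, dtype=cp.float32)

# One dense layer: the same multiply-accumulate work for every input row,
# spread across the GPU's cores at once.
hidden = cp.maximum(inputs @ weights + bias, 0.0)   # ReLU activation

print(hidden.shape)   # (4096, 256)
```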

Presently, GPU acceleration is being offered by Amazon and Nimbix, and Google is preparing to equip its cloud platform with GPUs for Google Compute Engine and Google Cloud Machine Learning services.

For more, please visit https://www.rtinsights.com/gpus-the-key-to-cognitive-computing/