Enterprise tech movers for 2017
Security analytics engines digest data from network gear and endpoints in search of anomalies that indicate threats. By setting a baseline for normal activity, these engines spot out-of-the-ordinary behaviour and assess whether it represents malicious activity.
By incorporating AI and machine learning, this technology will expand its ability to detect anomalies not only in network traffic but also in the behaviour of individual machines, users, and combinations of users on particular machines.
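As a rough illustration of that baseline-and-deviate idea, a few lines of Python can flag behaviour that strays too far from what a user normally does. The metric, threshold and sample figures below are purely hypothetical and are not any vendor's implementation:

```python
# Minimal sketch of baseline-and-deviate anomaly detection.
# The metric (outbound MB per hour), the threshold and the sample
# data are illustrative assumptions only.
from statistics import mean, stdev

def build_baseline(history):
    """Summarise 'normal' behaviour from historical observations."""
    return mean(history), stdev(history)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag observations more than `threshold` standard deviations from the baseline."""
    mu, sigma = baseline
    if sigma == 0:
        return value != mu
    return abs(value - mu) / sigma > threshold

# Example: outbound megabytes per hour for one user on one machine.
history = [12, 9, 14, 11, 10, 13, 12, 9, 11, 10]
baseline = build_baseline(history)

for observed in (11, 13, 480):  # 480 MB would be flagged for analyst review
    print(observed, "anomalous:", is_anomalous(observed, baseline))
```

Production systems model far richer signals (users, hosts, applications and their combinations), but the principle is the same: learn normal first, then surface deviations.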
As these platforms become more sophisticated and trusted in 2017, they will be able to spot attacks in earlier stages and stop them before they become active breaches.
And the big guns are all involved in making this happen: Cisco with its Tetration Analytics platform, IBM with Watson cognitive computing for cybersecurity, and Google/Alphabet with its DeepMind lab, to name just a few.
Cisco’s Tetration Analytics product is a turnkey package that gathers information from hardware and software sensors and analyses it using big data analytics and machine learning. In the security realm, the system sets a baseline for normal network and application behaviour and identifies deviations in communication patterns in real time; Tetration’s forensics search engine can also be used for other security and user behaviour analytics.
“The single most important thing customers can do to protect the data centre is set up a whitelist of who has access to what, but it is one of the most difficult tasks to implement,” said Tom Edsall, a senior vice president and CTO with Cisco. “Tetration lets users set up a whitelist model and policies more quickly and efficiently than they could before.” This capability will address key cybersecurity challenges and move toward the “self-driving data centre” of the future, he said.
Cisco promises many new security-related applications will be layered onto Tetration.
Then there is IBM’s Watson supercomputer, which is being unleashed on corporate networks to analyse traffic in search of malware while also learning from its own experience and from white papers, threat intelligence and news about cybercrime. Over time, Watson will develop new strategies for finding attacks as they unfold. The Watson for Cybersecurity project is in beta now and could become a full-fledged cybersecurity service sometime in 2017.
Separately, there is governmental research underway that could impact the cybersecurity world this year as well. For example, the Intelligence Advanced Research Projects Activity (IARPA), the radical research arm of the Office of the Director of National Intelligence, wants to build a system of sensors that can monitor everything from search terms to social media output for early warning signs of cyberattacks.
“Cyberattacks evolve in a phased approach. Detection typically occurs in the later phases of an attack, and analysis often occurs post-mortem to investigate and discover indicators from earlier phases. Observations of earlier attack phases, such as target reconnaissance, planning, and delivery, may enable warning of significant cyber events prior to their most damaging phases,” IARPA wrote in announcing its Cyberattack Automated Unconventional Sensor Environment (CAUSE) program.
“It is expected that the technology developed under the CAUSE Program will have no ‘human in the loop.’ Experts may help develop, train, and improve the solution systems, but they will not manually generate warnings, guide the system, or filter warnings before they are delivered to the [IARPA] Team. The performer produced warnings must be machine-generated and submitted automatically…” IARPA wrote of the system.
Bullish for blockchain
There’s no shortage of hype around blockchain’s potential to revolutionise transactions. Heading into the new year, some enterprises will put blockchain hype to the test as they start exploring its ability to reduce transaction costs, streamline partner interactions, and accelerate business processes.
Blockchains are distributed public ledgers, lauded for their ability to establish trust in the digital world by way of verifiable transactions and without the need for a middleman. The cryptocurrency bitcoin is the most familiar application. In the financial world, blockchains are expected to disrupt how financial institutions conduct payments and wire transfers, process securities trades, and handle compliance reporting, to name just a few use cases.
Outside of finance, industry watchers cite opportunities for blockchains to play a role in core business functions from supply chain and manufacturing to legal and healthcare. When there’s an audit trail required — to track the provenance of finished goods, for example, or to document a real estate title — blockchain networks can be used to create verifiable, tamper-proof records in an encrypted format and without having a central authority.
Enterprise IT leaders “are not so much interested in secure, anonymous public networks like bitcoin but in closed networks that are between specific groups of people, particularly between enterprises that have to interact,” says Roger Kay, founder and president of market intelligence firm Endpoint Technologies Associates.
In a blockchain, each page in a ledger of transactions forms a block, which is linked via a cryptographic hash to the previous block, and new transactions are authenticated across the distributed network before the next block is formed. “Blocks are always agreed on, and each one has an encrypted representation of everything that happened before, so you can tell it’s authentic. You can’t tamper with the chain at any point,” Kay says. As a trust system, “it essentially eliminates the need for a third-party guarantor.”
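To make Kay’s description concrete, the sketch below hash-chains a toy ledger in Python, so that altering any earlier block breaks every later link. The field names, the choice of SHA-256 and the sample transactions are illustrative assumptions, not a description of any particular blockchain’s design:

```python
# Minimal sketch of a hash-chained ledger: each block carries the hash
# of the previous block, so tampering anywhere breaks verification.
import hashlib
import json

def make_block(transactions, prev_hash):
    """Bundle transactions with the previous block's hash and fingerprint the result."""
    body = {"transactions": transactions, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify_chain(chain):
    """Recompute every hash and confirm each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {"transactions": block["transactions"], "prev_hash": block["prev_hash"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["Alice pays Bob 10"], prev_hash="0" * 64)
second = make_block(["Bob pays Carol 4"], prev_hash=genesis["hash"])
chain = [genesis, second]

print(verify_chain(chain))   # True
chain[0]["transactions"][0] = "Alice pays Bob 1000"
print(verify_chain(chain))   # False: tampering breaks the chain
```

Real blockchains add distributed consensus on top of this structure, which is what removes the need for a third-party guarantor.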
That is not to say blockchain technology is mature, however. “It’s still early days,” Kay warns.
Early adopters have launched hundreds of pilot projects, but there’s a long way to go before blockchain hits mainstream adoption. Among the obstacles blockchain deployments face are: technical challenges, lack of standards and governance models, shortage of skills, and scalability concerns.
As 2016 closes, vendors continue to devise distributed applications and platforms based on blockchain technology, and venture capital firms continue to pour money into the effort. More than $1.4 billion (€1.34 billion) has been invested in blockchain technology over the past three years, according to an August report by the World Economic Forum (WEF). More than 90 corporations have joined blockchain development consortia, and more than 2,500 patents have been filed. The WEF predicts that by 2017, 80% of banks will initiate projects that involve distributed ledger technology.
For enterprises interested in exploring how they can use blockchain and distributed ledgers, research firm Gartner recommends starting with limited-scope trials that are aimed at specific problems. Enterprises can start to investigate how distributed networks might improve business processes that are constrained by transaction inefficiency and how technology suppliers might be able to help.
“The challenge for blockchain users and CIOs is to set appropriate expectations among business leaders,” Gartner writes in its 2017 strategic predictions report. “Plan for a reasonable rollout, failure and recovery (especially through 2018); develop realistic proof of concept (POC) use cases; and be agile from an IT and business perspective to follow the best path to success.”
Machine learning: the promise of predicting the future
Historically, the challenge for organisations that want to use machine learning and cognitive computing has been the need to hire expert data scientists who have spent their careers learning how to turn raw data into working algorithms.
In recent years, thanks to the proliferation of public cloud computing platforms, that’s changing. Companies like Amazon Web Services, Google, Microsoft and IBM have all rolled out cloud-based machine learning platforms. “It’s really lowered the barrier quite a bit,” says Sam Charrington, an analyst and blogger who tracks the machine learning market, adding that the technology is being democratised for everyday developers to use in their applications.
At its most basic level, machine learning is the process of using data to make predictions about future behaviour. Most commonly it has been used for fraud protection (training computers to detect anomalous behaviour) and for predicting future revenues and customer churn. IBM has trained its Watson platform to create sophisticated chatbots for customer interaction and to help healthcare workers provide better care.
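As a rough illustration of that idea, the toy model below scores customers for churn risk from two made-up features. The data and the use of scikit-learn are illustrative assumptions, not a reference to Watson or any other vendor platform mentioned here:

```python
# Minimal sketch of "use historical data to predict future behaviour":
# a toy customer-churn model. Features, labels and sample customers
# are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Historical customers: [months as customer, support tickets last quarter]
X_train = [[24, 0], [3, 5], [36, 1], [2, 7], [18, 2], [1, 9], [30, 0], [4, 6]]
y_train = [0, 1, 0, 1, 0, 1, 0, 1]  # 1 = churned, 0 = stayed

model = LogisticRegression()
model.fit(X_train, y_train)

# Score current customers: estimated probability of churning next period.
for customer in ([26, 1], [2, 8]):
    prob = model.predict_proba([customer])[0][1]
    print(customer, f"churn risk: {prob:.2f}")
```

Cloud machine learning platforms package up this kind of workflow, which is why they lower the barrier for everyday developers.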
It is still early days for adoption though: A recent study by consultancy Deloitte reported that only 8% of enterprises use machine learning technology today. Allied Market Research predicts the industry is growing at a 33% compound annual growth rate and will reach $13.7 billion (€13.2 billion) by 2020.
“The practice of employing algorithms to parse data, learn from it, and then make a determination… is gathering speed,” reports 451 Research analyst Krishna Roy. Consumer adoption of platforms like Amazon’s Echo and Apple’s Siri has seeded this market, but enterprise adoption has been held back by a lack of market education and by limited integration of these systems with existing enterprise platforms. Still, she notes that one day this technology could become a “fundamental part of an enterprise’s analytics fabric.”
IDG News Service