Posted On: October 2016 - AppFerret
Cannabis-specific data services are optimizing business and educating consumers
When it comes to growing a plant-based industry, big data and technology may be more valuable than fertilizer.
As the legal marijuana industry continues to expand, so does the need for data services that increase efficiency and educate consumers. But companies like SAP and NetSuite that typically provide data collection and organization to more traditional industries aren’t extending their business to the legal cannabis industry, creating an opening for new companies catering just to the marijuana market.
“They’re bringing normal business processes into the cannabis industry,” says Alan Brochstein, founder of 420 Investor, an investor community for publicly traded cannabis companies. “It’s about time.” SAP and NetSuite didn’t respond to requests for comment.
With the legal cannabis market expected to reach $6.7 billion in medical and recreational sales in 2016, industry experts expect data to play a primary role in accelerating this growth. These data services track everything from plant cultivation to consumer purchasing trends, helping marijuana retailers comply with state regulations and optimize their inventory to meet demand.
One of those services, Flowhub, was founded in late 2014 as a cannabis-tracking software to help marijuana growers operate in compliance with state regulations. Before that, many manufacturers had a blind spot when it came to supply chain management, opening them up to potential missteps like missing plants or pesticide use, says Kyle Sherman, the company’s chief executive. “We wanted to provide tools to help people so we can legalize [cannabis] responsibly,” Sherman says.
About 100 companies in Colorado and Oregon have signed on to the Flowhub system, according to Sherman. The software also allows other data applications to be incorporated with it, similar to how mobile apps work within smartphones.
One of these data applications is Headset, which launched nearly two months ago but is already tracking $65 million worth of cannabis transaction information, according to Cy Scott, the company’s chief executive. Scott described Headset as “the Nielsen of cannabis,” providing market intelligence data for marijuana sellers including guidance ranging from how to stock inventory to how to price products. “It was almost obvious,” Scott says. “Every other retail industry has this service.”
The Headset application can show retailers specific details on which to base inventory decisions, like whether granola edibles are outselling caramel edibles, as well as overall trends, like the declining popularity of marijuana flower, the smokable form of the plant, Scott says. “Our customers range from the largest retailers to the newest retailers in the industry,” he adds.
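The sort of category comparison Scott describes can be sketched in a few lines. This is a hypothetical illustration, not Headset's actual pipeline; the transaction records and field names are invented:

```python
from collections import defaultdict

# Hypothetical point-of-sale records; Headset's real schema is not public.
transactions = [
    {"category": "granola edible", "units": 40},
    {"category": "caramel edible", "units": 25},
    {"category": "flower", "units": 30},
    {"category": "granola edible", "units": 35},
]

# Aggregate unit sales per category to see what is outselling what.
units_by_category = defaultdict(int)
for t in transactions:
    units_by_category[t["category"]] += t["units"]

# Rank categories by units sold, best sellers first.
ranked = sorted(units_by_category.items(), key=lambda kv: kv[1], reverse=True)
for category, units in ranked:
    print(f"{category}: {units} units")
```

With the sample data above, granola edibles (75 units) outrank flower (30) and caramel edibles (25), the kind of signal a retailer would use to restock.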
There are also data resources for consumers. Leafly, a cannabis information resource website, uses crowdsourced data to provide reviews of strains and dispensary directories to help customers navigate the legal marketplace. The gradual legalization of the industry has brought a new source of community-based feedback, says Zack Hutson, director of public relations at Privateer Holdings, a cannabis private-equity firm that owns Leafly, adding that the site had about 9 million unique visitors in February.
Cannabis Reports also provides a comprehensive database for the strains of cannabis on the legal market. The company’s chief executive, David Drake, says he brought the website online after noticing the absence of tech services within the legal cannabis industry. The database includes more than 30,000 strains of marijuana, the companies that produce them, the lab tests performed on them, medical studies and other information gathered from online research.
“It’s a really big responsibility to have that amount of data, and we’re making it available in a very open fashion,” Drake says. “We’re looking to try and serve anybody trying to find out about cannabis.”
The company provides free information for consumers on its websites, and businesses can pay a monthly fee for customer insight data and data organization like charts and pricing information. “It makes people a lot more comfortable about the industry when you know all the data is there and it’s all transparent,” Drake says.
This transparency may be crucial for the industry as legalization movements across the country continue to gain steam. Much of the negative reputation marijuana has garnered in past decades has been drawn from “false data,” says Flowhub’s Sherman. “What we really need to squash prohibition is great data.”
Original article here.
Want your IoT devices to last a billion years? Use an AA battery, or just suck the leakage out of your transistors, according to a new paper published in the October 21 issue of the journal Science.
The two researchers, Sungsik Lee and Arokia Nathan, both of the Electrical Engineering Division at the University of Cambridge in the UK, have designed new ultralow-power transistors that operate on energy ‘scavenged’ from their environment. If all goes as described, they could run for months or years without a battery, providing enough juice for wearable or implantable electronics.
The design uses a principle similar to sleep mode, much like other low-power devices, but adds the ability to harness the electrical current that flows in the near-off state. This energy leakage is common to all transistors, but the researchers say this is the first time it has been effectively captured and put to functional use.
The transistors can be produced at low temperatures and can be printed on almost any material, from glass and plastic to polyester and paper. They are based on a unique geometry that exploits a ‘non-desirable’ characteristic: the point of contact between the metal and semiconducting components of a transistor, known as the ‘Schottky barrier.’
“We’re challenging conventional perception of how a transistor should be,” said Nathan. “We’ve found that these Schottky barriers, which most engineers try to avoid, actually have the ideal characteristics for the type of ultralow power applications we’re looking at, such as wearable or implantable electronics for health monitoring. This will bring about a new design model for ultralow power sensor interfaces and analogue signal processing in wearable and implantable devices, all of which are critical for the Internet of Things.”
The new design also addresses the issue of scale. As transistors get smaller, their electrodes start to influence one another’s behavior and voltages spread, so transistors typically fail to function below a certain size. With this design, the researchers were able to use the Schottky barriers to keep the electrodes independent of one another, so the transistors can be scaled down to very small geometries.
The design also achieves a high level of gain, or signal amplification. The transistor’s operating voltage is less than a volt, with power consumption below a billionth of a watt.
“If we were to draw energy from a typical AA battery based on this design, it would last for a billion years,” said Lee. “Using the Schottky barrier allows us to keep the electrodes from interfering with each other in order to amplify the amplitude of the signal even at the state where the transistor is almost switched off.”
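Those two figures can be sanity-checked with back-of-envelope arithmetic. The AA capacity and voltage below are typical alkaline-cell values assumed for illustration, not numbers from the paper:

```python
# Back-of-envelope: how long could one AA cell sustain a given average draw?
CAPACITY_MAH = 2500          # typical alkaline AA capacity (assumed)
VOLTAGE_V = 1.5              # nominal AA cell voltage (assumed)
SECONDS_PER_YEAR = 3.156e7

# mAh -> Ah -> coulombs, times voltage, gives stored energy (~13,500 J).
energy_joules = CAPACITY_MAH / 1000 * 3600 * VOLTAGE_V

def battery_life_years(avg_power_watts):
    """Years the cell lasts at a constant average power draw."""
    return energy_joules / avg_power_watts / SECONDS_PER_YEAR

# At the quoted sub-nanowatt ceiling the cell already lasts hundreds of
# thousands of years; the "billion years" figure implies an average draw
# down in the picowatt range, consistent with near-off-state operation.
print(battery_life_years(1e-9))    # ~4.3e5 years at 1 nW
print(battery_life_years(4e-13))   # ~1.1e9 years at 0.4 pW
```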
Sorry Doc, looks like we don’t need your 1.21 gigawatts after all.
Original article here.
When LinkedIn dug into the profiles of its more than 250 million members, it found a number of tech job titles that have come into existence over the last few years.
Wanting an infographic that profiled these hot job titles in an engaging way, LinkedIn hired Visually to design a graphic covering the unique skills and trending numbers behind them. View it here, or click here to see it on LinkedIn’s site.
It was December 2008 and Twitter was barely two years old. At the time, many questioned the point of the 140-character messaging platform. Many still do.
But one doubting Thomas stood out from the pack. Rather than just backstab the new social media tool in coffee shop conversations, Hans Scharler took action.
In a public demonstration highlighting the futility of Twitter, this American inventor and entrepreneur enabled his toaster to tweet.
You read right.
When Hans Scharler’s toaster is on, the @mytoaster Twitter handle tweets “Toasting.” When it’s done, it tweets “Done Toasting.” A small protest, yes. But a butterfly effect in the internet world. For it may have been this networked kitchen appliance, through its metaphorical wing flap, that created the hurricane of what is now called the “Internet of Things.”
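Conceptually, the tweeting toaster is just a state machine that announces its transitions. A minimal sketch; the posting callable here is a stand-in for whatever Twitter client the real @mytoaster used:

```python
class TweetingToaster:
    """Announce toasting state changes, in the spirit of @mytoaster."""

    def __init__(self, post):
        # `post` is any callable that publishes a message (e.g. a Twitter
        # API client); it is injected here so the sketch is testable offline.
        self.post = post
        self.toasting = False

    def start(self):
        if not self.toasting:
            self.toasting = True
            self.post("Toasting")

    def finish(self):
        if self.toasting:
            self.toasting = False
            self.post("Done Toasting")

# Usage: collect the "tweets" in a list instead of actually posting them.
tweets = []
toaster = TweetingToaster(tweets.append)
toaster.start()
toaster.finish()
print(tweets)  # ['Toasting', 'Done Toasting']
```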
And hurricane is no exaggeration. Today, the Internet of Things, or IoT, goes well beyond tweets regarding one’s breakfast status. In 2014, Google bought Nest, a home IoT collection of thermostats, smoke detectors, and other security systems, for an impressive $3.2 billion. That’s BILLION, with a “B.” And this storm is just growing. Research firm IDC believes that global IoT revenue will reach $1.46 trillion by 2020, by which time an estimated 25 billion to 200 billion devices will be connected within the IoT ecosystem.
“But hang on!” your internal monologue is screaming, “If IoT is this huge, why haven’t I heard of it before?”
Well, you have. The Internet of Things is sitting in your garage right now. Our cars, a “thing,” have been connected via the “internet” to other “things” for years now — although you took it all for granted. Jumping into the driver’s seat Monday morning, you overlooked the seamless link between your car stereo and smartphone.
On your journey to a client that same day, chances are you didn’t question the adjusted route based on real-time traffic alerts. And when you were headed home, and the car overheated, there’s a good chance that the requested emergency roadside assistance was triggered with a press of a button, or completely autonomously.
And that coffee you’re drinking? If you are sitting in Italy or Switzerland, then it may have been brought to you by IoT. Solair, an Italian company that was acquired this year by Microsoft for an undisclosed amount, has deployed its IoT cloud-based applications to help the Rancilio Group manage the coffee machines that it sells to hotels, restaurants and cafes. Through IoT, Rancilio manages coffee supplies, undertakes remote maintenance, and helps clients avoid sales losses through machine downtime. One hundred percent “thing”; zero percent human.
So where to next?
Well, globally there are at least 88 publicly listed companies active in the IoT space. These include the obvious Cisco, Google, and IBM, but also lesser-known firms like PTC, a company known for its design modelling software and product lifecycle management tools. PTC has also been snapping up smaller companies with existing capabilities in cloud services and artificial intelligence (AI).
And it’s this trichotomy of IoT, Cloud, and AI that allows for some very exciting products. Take DRONEBOX, for example. Through IoT sensors such as a precision-agriculture NDVI camera, crop harvests can be increased via better water and fertiliser management. And that’s just one application of the autonomous, self-charging drone.
IoT-enabled thermal and high-definition cameras expand the applications across asset inspection, emergency response, security, and even livestock management. And then there’s the myriad of other IoT sensors and actuators entering the market (see Figure 1, below). Once connected to the Cloud, these sensors open up a huge number of applications, as well as a new field known as prescriptive analytics.
In short, prescriptive analytics combines hybrid data, a mix of structured (numbers, categories) and unstructured data (videos, images, sounds, texts), with business rules to predict the future and to prescribe how to take advantage of this future scenario.
Hans Scharler’s IoT toaster, equipped with this computational power, would not only announce its current cooking status, but order the bread, jam and complementary orange juice in the previous week’s grocery order. It might also suggest a healthier option, based on a conversation it had with your IoT scales.
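A toy version of that prescriptive step, combining a consumption forecast with business rules; all thresholds, product names, and the health score are invented for illustration:

```python
def prescribe_order(predicted_slices_per_week, pantry_slices, health_score):
    """Turn a toast-consumption forecast plus business rules into a shopping action."""
    order = []
    # Rule 1: reorder bread (and the jam that goes with it) before the
    # pantry runs out relative to the predicted weekly consumption.
    if pantry_slices < predicted_slices_per_week:
        order.append("bread")
        order.append("jam")
    # Rule 2: nudge toward a healthier option if the (hypothetical)
    # IoT scales report a poor health trend, scored 0.0-1.0.
    if health_score < 0.5:
        order.append("wholegrain bread")
    return order

print(prescribe_order(predicted_slices_per_week=14, pantry_slices=6, health_score=0.4))
# ['bread', 'jam', 'wholegrain bread']
```

The prediction (14 slices a week) could come from any forecasting model; the prescription is what the rules layer adds on top.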
Original article here.
The malware that powered one of the worst denial of service cyberattacks of the last few years has infected internet-connected devices all over the world, reaching as many as 177 countries, according to security researchers.
At the beginning of this month, a cybercriminal released the source code of the malware that powered one of the worst-ever zombie armies, or botnets, made up of Internet of Things devices. The release of the malware, known as Mirai, gave cybercriminals with minimal skills a new tool to launch cyberattacks. It also gave internet defenders and security researchers a way to track down the bad guys’ activities and map their armies of hacked devices.
Imperva, a company that protects websites against Distributed Denial of Service (DDoS) attacks, is among those investigating Mirai. According to its tally, the botnet of Mirai-infected devices has reached a total of 164 countries. A pseudonymous researcher who goes by the name MalwareTech has also been mapping Mirai, and according to his tally, the total is even higher, at 177 countries.
“Most indiscriminately spread malware will show up all over the globe,” MalwareTech said in a Twitter message.
Mirai was used to build a botnet that hit the website of security journalist Brian Krebs with a large DDoS attack last month. A hacker who goes by the name Anna-senpai released the source code of the malware at the beginning of October, but it’s unclear who really is behind it.
Mirai isn’t really a fancy piece of malware, but it’s effective and spreads quickly because it targets Internet of Things (IoT) devices that are extremely easy to hack. These devices, mostly DVRs and surveillance cameras, use default and predictable passwords, such as “admin” and “123456”, “root” and “password,” or “guest” and “guest,” among others.
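The weakness is easy to illustrate from the defender's side: audit a device's credentials against the factory-default pairs Mirai tries. The list below interprets the quoted strings above as username/password pairs and is only a small excerpt; the leaked source contains a few dozen entries:

```python
# Factory-default credential pairs drawn from those quoted above
# (a small excerpt of the dictionary in the leaked Mirai source).
MIRAI_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "123456"),
    ("root", "password"),
    ("guest", "guest"),
}

def is_vulnerable(username, password):
    """True if a device's credentials match a known Mirai default pair."""
    return (username, password) in MIRAI_DEFAULTS

print(is_vulnerable("root", "password"))    # True: Mirai would get in
print(is_vulnerable("root", "Xk3!v9q#wz"))  # False: not in the default list
```

The point is how little sophistication the attack needs: a fixed dictionary lookup, tried against millions of devices.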
Thanks to these bad passwords, and the Mirai malware, the Internet of (hackable) Things has truly gone global.
Original article here.
Three years after its purchase of SoftLayer, IBM has finally coalesced its cloud computing strategy around the cloud platform and is winning new IT fans.
As more enterprise IT shops embrace shifting critical workloads to the cloud, Big Blue may finally be flexing its muscles.
Acceptance of IBM’s cloud initiative is building, based on an increasing number of deals with larger enterprises and on industry observations. A big factor in this momentum is that a handful of the company’s cloud-based technologies now work better in concert with each other. These include its data analytics software, the Bluemix cloud application development platform, and even IBM Watson, all of which now operate with the SoftLayer cloud platform.
“We have a lot invested in their legacy hardware and software for quite some time,” said one project manager with a large transportation company in northern Florida who requested anonymity. “I like the story from them [about SoftLayer] because it gives us more options to pursue with cloud, like bare-metal, that can improve the performance of some of the data-intensive apps we have.”
IBM came out of the blocks slowly with its first cloud initiatives several years ago and has mostly trailed market leaders Amazon Web Services, Microsoft and Google. But as its legacy hardware and software business continues to sag, the company has gradually focused more of its existing and new cloud applications and tools around SoftLayer — and some believe IBM can offer corporate users a more compelling cloud narrative.
“They are selling more [products and services] on top of SoftLayer, which has allowed them to make progress with a legitimate cloud strategy,” said Carl Brooks, an analyst with 451 Research. “They might be able to put out some of the engine fires they had with that business.”
IBM gaining new cloud users
Earlier this year, Halliburton, a service provider to the oil and gas industry, implemented IBM’s cloud platform to run its reservoir simulation software designed to help the company better understand how complex oil and gas fields might behave given different development scenarios.
Using the CPU, GPU and storage capabilities of IBM Cloud, which were delivered as a service, the company can now run through hundreds of simulations that help it better forecast the potential of complex oil and gas fields, using both bare-metal and virtual servers.
Using both kinds of servers gives Halliburton more flexibility to scale compute power up and down depending on customers’ requirements, as well as switch from a Capex to an Opex model, company officials said. The company also worked with IBM to set up a compute cluster in the IBM Cloud that was then connected to Halliburton’s global network.
“We knew the more computing power we had, the more efficient a job we could do,” said Steven Knabe, Halliburton’s regional director for consulting for Latin America. “With the cluster, we can build and run much more complex simulation models and increase our chances of winning projects [more] than we had before.”
Another new IBM Cloud customer is JFE Steel Co., one of the largest steel makers in the world, which inked a five-year agreement last month to migrate its core legacy systems to the IBM Cloud while also consolidating its infrastructure. The Japan-based company will deploy SoftLayer as its cloud infrastructure, along with IBM’s Cloud Orchestrator to automate cloud services and Control Desk for IT management services.
Driving the company’s decision was a need to establish a more efficient business model, which meant swapping in a more flexible IT infrastructure to more quickly adjust to market changes brought on by rapidly declining steel prices.
“The company realized it had to modernize its IT infrastructure to take advantage of some new business processes and modernize the business,” said Charles King, president and principal analyst with Pund-IT. “But it didn’t want to pay a lot up front to do that, and IBM Cloud gave them a way forward to get all that underway.”
King also believes IBM has done a better job at winning over both existing and new IT shops to its cloud strategy, not just because of SoftLayer, but its investment in its now 48 data centers around the world. These data centers are used not just for hosting but also for joint development of cloud-based software between IBM engineers and local developers.
“You have to give them credit for the considerable amount of money spent in expanding their cloud data centers,” King said. “That sort of global footprint helps when reaching out to new [cloud] customers like JFE.”
User cloud confidence grows
Many corporate users are more confident to move generous chunks of their business anchored on premises to the cloud — and that helps IBM’s cloud initiative as well as those of all its major competitors. Decade-long fears about the lack of reliability and security of mission-critical data and applications in the cloud appear to be melting away.
“The enterprise three years ago was new for us, from a SoftLayer perspective,” said Marc Jones, CTO and IBM Distinguished Engineer for IBM Cloud. Now, though, enterprises are not only more open to bringing those workloads to the cloud; it’s a first-choice destination for their applications and services. “Before, it was a lot of research and, ‘Could I, would I?’ But now, it’s, ‘Let’s go,'” he said.
Some analysts believe the industry is rapidly approaching a tipping point in the widespread adoption of cloud computing to where it becomes the primary way IT shops conduct business.
“Enterprises are now more comfortable with cloud computing with on-demand infrastructure and servers replacing colocations,” 451 Research’s Brooks said. “These [IBM deals] reflect the mainstreaming of cloud computing, more than any particular pizazz on the part of IBM.”
Further indication that cloud adoption is reaching a tipping point is the heightened interest among resellers and business partners of top-tier vendors. In many cases, it’s the channel, not the vendors, that directly sells and services a range of cloud computing offerings to IT shops.
“Mainstream IT computing is partaking of [Amazon Web Services] and Azure in ways it was not two years ago,” Brooks said, “and they don’t want to do it all themselves. They want the people in the middle [resellers] to do it for them and IBM is picking up on this trend.”
Original article here.
Asked and answered. These real-life pitch questions from Steve Case’s Rise of the Rest tour can help give you an edge on your next pitch.
Pitch competitions are a reality of startup life, as common as coffee mugs that say “Hustle Harder” or thought leaders expounding on the need for “grit.”
Still, even the smartest entrepreneur isn’t always ready for what competition judges might ask. During Steve Case’s Rise of the Rest tour, a seven-city road trip across the U.S. highlighting entrepreneurs outside the major startup hubs, founders in Phoenix participated in their own mock pitch competition, allowing them to practice and polish their answers.
We’ve collected a curated selection of questions asked during the competition, some more often than others.
To get ready for your next competition, start prepping with these potential questions:
1. What’s your top priority in the next six months? What metric are you watching the most closely?
2. What’s your exit strategy?
3. How does your product/service work?
4. Who is your customer?
5. Do you have contracts and if so, how often do they renew?
6. Why is your team the team to bring this to market?
7. You say you’ll have 100 staffers in five years. You have six now. What will those new staffers do?
8. Why is your product/service better than what’s already on the market?
9. Who are your competitors?
10. Do you have a patent?
11. If you win the investment, what would that partnership look like?
12. You’ve secured a strategic partnership. Is that partnership exclusive? And if not, is that a liability?
13. What’s your barrier to capacity?
14. What’s your expansion strategy?
Pricing and revenue
15. How much of your revenue is from upsells? And how do you see that changing over time?
16. Everyone says they can monetize the data they collect. What’s your plan?
17. Can you explain your revenue model?
18. What’s your margin?
19. Are you charging too little?
20. Are you charging too much?
21. How will you get to 1 million users?
22. Is this trend sustainable?
23. What regulatory approvals do you need and how have you progressed so far?
Original article here.
Nearly 140 private companies working to advance artificial intelligence technologies have been acquired since 2011, with over 40 acquisitions taking place in 2016 alone (as of 10/7/2016). Corporate giants like Google, IBM, Yahoo, Intel, Apple, and Salesforce are competing in the race to acquire private AI companies, with Samsung emerging as a new entrant this month through its acquisition of startup Viv Labs, which is developing a Siri-like AI assistant.
Google has been the most active acquirer. In 2013, it picked up deep learning and neural network startup DNNresearch from the computer science department at the University of Toronto. This acquisition reportedly helped Google make major upgrades to its image search feature. In 2014, Google acquired British company DeepMind Technologies for some $600M (Google DeepMind’s program recently beat a human world champion in the board game Go). This year, it acquired visual search startup Moodstocks and bot platform Api.ai.
Intel and Apple are tied for second place. The former acquired three startups this year alone: Itseez, Nervana Systems, and Movidius, while Apple recently acquired Turi and Tuplejump.
Twitter ranks third, with four major acquisitions, the most recent being image-processing startup Magic Pony.
Salesforce, which joined the race last year with the acquisition of Tempo AI, has already made two major acquisitions this year: Khosla Ventures-backed MetaMind and open-source machine-learning server PredictionIO.
We updated this timeline on 10/7/2016 to include acquirers who have made at least two acquisitions since 2011.
Major Acquirers In Artificial Intelligence Since 2011

| Company | Date of Acquisition | Acquirer |
| --- | --- | --- |
| Dark Blue Labs | 10/23/2014 | DeepMind |
| Granata Decision Systems | 1/23/2015 | |
Original article here.
The ACMA’s new management plan will see it focus on incoming network technologies and the spectrum issues behind them, including mmW bands for 5G and spectrum sharing for IoT.
The Australian Communications and Media Authority (ACMA) has released its five-year spectrum outlook (FYSO) and 12-month work plan, with the federal government agency focusing on arrangements to support 5G, the Internet of Things (IoT), and dynamic spectrum access (DSA).
The Five-year spectrum outlook 2016-20: The ACMA’s spectrum management work program [PDF] details the influx of “transformative” technologies that need to be dealt with out to 2020.
For 5G, the ACMA is considering the use of millimetre wave (mmW) bands.
“Enabling the next phase of mobile network development is likely to require the ACMA’s attention in a number of areas,” the FYSO said.
“From a spectrum perspective, 5G appears certain to use (though not exclusively) large contiguous bandwidths (hundreds of MHz or more) in millimetre wave bands.”
The ACMA is monitoring both millimetre wave bands and lower-frequency bands, including the 2.3GHz, 2.5GHz, 3.5GHz, and 3.6GHz bands, for 5G mobile services.
“The 2.3GHz, 2.5GHz and 3.5GHz bands are already available for use for mobile broadband services in Australia and could feasibly be used for early deployment of 5G or pre-standard 5G in Australia,” the ACMA added.
“The 3.6GHz band is included in the initial investigation stage of ACMA’s mobile broadband work program.”
The ACMA is planning to publish a discussion paper on planning issues for the 1.5GHz and 3.6GHz spectrum bands over the next few weeks, with the latter band being eyed for 5G purposes worldwide.
For IoT concerns, the ACMA is looking at a broad range of spectrum bands due to the large number of varied uses and users involved.
“Given the huge diversity of uses of IoT, there is no simple solution to providing spectrum for all of the applications which are likely to require access to it under a range of protocols from dedicated spectrum to commons spectrum, and options in between,” ACMA acting chairman Richard Bean said at the CommsDay Congress in Melbourne this week.
“We are taking, and have taken, steps to make new spectrum available to support a range of low-power applications, including M2M [machine-to-machine] applications in the 900MHz band, as part of the implementation of our review of the 803-960MHz band. Permanent arrangements in this band are not currently set to be in place until 2021, but we will consider early access applications.”
The ACMA is also examining IoT opportunities in the very high frequency (VHF) band.
The ACMA had previously argued in favour of a default spectrum band for all IoT devices across the globe, or, alternatively, sensors that can identify which country a device is operating in.
The government agency in December released a set of proposed changes to spectrum regulations aimed at providing easier access for M2M operators utilising spectrum for IoT, and outgoing ACMA chairman Chris Chapman in February emphasised the need for IoT spectrum.
In regards to DSA, the ACMA recognised spectrum sharing as being “fundamental” for efficient spectrum management. DSA relies on users and uses to co-exist on the same spectrum band, with awareness of the environment required.
The ACMA said there are currently three ways for devices to become more aware of their surroundings to enable dynamic sharing of a spectrum band.
“At this stage, three major techniques to enhance a device’s awareness of its surroundings have been identified: Geolocation with database look-up; sensing; and beacon transmissions,” the FYSO says.
“These techniques can be used to make use of spectrum ‘white space’, where secondary users take advantage of intermittent, occasional or itinerant use by primary users.”
The ACMA in July said spectrum sharing is the key to IoT and 5G, with ACMA Spectrum Planning Branch executive manager Christopher Hose saying that there needs to be more cooperation between industry and the ACMA to achieve this goal.
The agency said it recently implemented DSA across the 3400MHz-3600MHz spectrum band between Defence radar systems and terrestrial wireless broadband.
Lastly, the ACMA’s new 12-month work plan will see the agency focus on 10 projects across three “themes”. The first theme will see the ACMA implement its mobile broadband strategy by Q4 2016; work on priority compliance areas, including interference management, customer cabling compliance, and transmitter licensing compliance across the 400MHz band, by June 2017; set spectrum pricing initiatives by Q1 2017; look into spectrum allocations in the 700MHz, 850MHz, 1800MHz, 1.5GHz, 2GHz, 2.3GHz, 3.4GHz, and 3.6GHz bands; implement the regional digital radio rollout plan, with scoping to be completed by Q4 2016 and implementation from Q1 2017; and convert AM to FM commercial radio broadcasting services in selected regional licence areas between Q4 2016 and Q1 2017.
The second theme will see the ACMA develop its customer self-service program, with device registration and 900MHz station registration online forms made available in Q3 2016; XML payload form for APs to be made available in Q4 2016; and apparatus licence application forms to be available in Q1 2017.
The final theme will involve implementation of the government’s spectrum review in accordance with the Department of Communications’ timeline; implementation of the 400MHz spectrum band review, with the second milestone due between December 2016 and June 2017 and the third milestone in December 2017; and updating the Australian Radiofrequency Spectrum Plan in January 2017.
The ACMA is inviting comment on 5G mmW bands, IoT spectrum, DSA, and its approach to the new 12-month work plan in response to the FYSO.
Earlier on Friday, the Department of Communications announced that the ACMA would also be auctioning off 2x 15MHz of the 700MHz spectrum band that went unsold during the 2013 digital dividend auction, following Vodafone Australia’s proposal to buy the spectrum outright.
Original article here.
When it comes to implementing a cloud infrastructure, whether it’s public, private, or hybrid, most IT departments view the technology as a way to cut costs and save money, according to a recent analysis from CompTIA. The report also shows that SaaS is seen as the most useful cloud service.
When it comes to cloud computing adoption, money is still the main motivator.
In fact, many large IT departments view the cloud, whether it’s public, private, or a hybrid combination, as a way to cut costs and save money, according to a new analysis from CompTIA.
The study found that 47% of large enterprises, 44% of medium businesses, and 41% of small firms surveyed reported that slashing costs outweighed other factors such as speed, modernization, and reducing complexity.
The report, “Trends in Cloud Computing,” is based on online responses from 500 business and IT executives surveyed during July. The study also cites Gartner numbers forecasting that the public cloud will generate $204 billion in worldwide revenue in 2016, a 16.5% increase over last year’s figure.
Overall, the Sept. 27 CompTIA report finds a robust, if somewhat maturing, market for all different types of cloud structures and services. The study finds that nine out of ten companies are using cloud computing in at least one way. Additionally, 71% of respondents reported that their business is using cloud either in production or at least for non-critical parts of the company.
This, the report finds, is a sign of maturity:
The familiarity with technical details has grown, and while business opportunities may still flourish around models that are mislabeled as cloud, the market is growing more savvy. End users from both the IT function and business units are growing more aware of the tools they are using and how those tools compare to other options that are available.
While the cloud is popular, its ability to trim costs in the face of mounting pressure on budgets and bottom lines is what’s driving the adoption of more of the technology. In an email exchange with InformationWeek, Seth Robinson, senior director of technology analysis for CompTIA, writes that IT sees cloud as the ultimate way to do more with less.
“Cost savings are important for IT as they rethink what a modern architecture looks like, but that is only one factor for building a new IT approach,” Robinson wrote.
No matter what the motive, Robinson writes that IT needs to view cloud as a way to deliver value not only to the tech department but to the whole business as well. This is a major concern as IT is asked to deliver solutions across many different parts of the enterprise.
“Cloud offerings can deliver cost efficiency, but they can also simplify workflow, speed up operations, introduce new features, or lead to new business products/services,” Robinson wrote. “The role of the IT team is not to simply implement a cloud component to perform a discrete function, but to drive business objectives forward by utilizing the right mix of cloud solutions.”
For those surveyed in the CompTIA study, private cloud remains the primary option, although that is expected to change soon. The report found that 46% of respondents are using private cloud, compared to 28% using public cloud, and 26% working with a hybrid option. Those numbers will shift as companies get more familiar with what is a true cloud platform and what is not, according to Robinson.
“Private cloud usage is probably still somewhat exaggerated even as companies are becoming more precise in their terminology,” Robinson writes. “Long-term, we expect that companies will migrate towards a hybrid or multi-cloud model, utilizing public cloud, private cloud, and [on-premises] resources.”
When it comes to the different types of cloud technologies that businesses are using, Software-as-a-Service (SaaS) is the most popular — a finding that supports other recent studies showing that IT departments are increasingly adopting SaaS. It also highlights the popularity of companies such as Salesforce, and of Microsoft, which uses a SaaS model to push out versions of its Office 365 suite, as well as Windows 10.
“SaaS options give IT more flexibility across many areas when trying to build an overall technology environment,” Robinson writes. “There are benefits to be had by replacing individual applications with SaaS, but greater benefits to understanding how multiple SaaS applications work together to enable operations.”
The most popular cloud-based application is email, which 51% of respondents reported using. Other top-ranking applications include:
- Web presence at 46%
- Business productivity suites — Office 365 and Google Apps — at 45%
- Collaboration at 39%
- CRM at 37%
- Financial management at 32%
- VoIP at 31%
- Virtual desktop at 30%
Another finding of the report suggests that for most applications, there was a dramatic drop in the number of companies that say they are using a cloud solution from 2014 to 2016, along with corresponding jumps in the number of companies reporting use of on-premises systems. However, the fluctuations could be attributed to the fact that what IT and businesses call “cloud” has changed as the technology has matured.
“In the early days of cloud, employees likely assumed that any [off-premises] application was cloud-based (or may have even assumed the use of SaaS applications without considering where software was hosted),” the report stated. “With a greater appreciation for cloud-specific characteristics, employees are honing their assessment.”
Original article here.
A huge online attack enabled by Internet-connected devices illuminates a problem keeping security experts awake at night.
When the website of security expert Brian Krebs recently went down, it wasn’t bad luck—it was the result of a huge surge of data: 620 gigabits per second. And now we know where it came from. It was an army of Internet-connected devices, being used as slaves to take down servers.
According to the Wall Street Journal, as many as one million security cameras, digital video recorders, and other connected devices have been employed by hackers to carry out a series of such attacks. When corralled together, these pieces of hardware can be used as a so-called botnet, collectively sending data and Web page requests to servers with such ferocity that they’re overwhelmed and ultimately crash.
It’s a powerful new way of putting an old idea into practice. Attackers have long installed malware on PCs to have them act as bots that they control, and more recently home routers and printers have been used to the same ends. But as Internet-connected devices proliferate in our homes and offices, the potential number of devices to draw upon is increasing dramatically.
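The arithmetic behind these attacks is simple but sobering: even a modest per-device upload rate, multiplied by a million compromised cameras and DVRs, reaches the flood volumes reported here. The sketch below is a back-of-envelope estimate; the device count and per-device rate are illustrative assumptions, not figures from the article.

```python
# Back-of-envelope estimate of aggregate botnet traffic.
# num_devices and mbps_per_device are illustrative assumptions.
def aggregate_gbps(num_devices: int, mbps_per_device: float) -> float:
    """Total flood volume in gigabits per second."""
    return num_devices * mbps_per_device / 1000.0

# ~1 million devices each pushing ~1 Mbps upstream already
# yields a terabit-scale flood.
print(aggregate_gbps(1_000_000, 1.0))  # 1000.0 Gbps, i.e. ~1 Tbps
```

On these assumptions, the 620 Gbps surge that hit Krebs’s site needs only a fraction of the million devices the Wall Street Journal describes.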
The scale of the new set of attacks is unprecedented. According to the BBC, this recent spate has been able to barrage servers with data at rates of over a terabit per second. In addition to Krebs’s site, the targets have included the servers of French Web hosting provider OVH. The attacks may have been carried out by the same botnet.
The news raises fresh concerns about the security of Internet of things devices. Purpose-built to be controlled over the Internet, such devices have been billed as the future of sensing and control to businesses and domestic users alike—from connected video cameras and speakers to smart thermostats and lightbulbs. While initially slow to gain popularity, they are proliferating as they’ve become increasingly user-friendly.
But there’s a problem. Many such devices are purchased, installed, and then used without much further attention being paid to their configuration. That means that they may never be updated, leaving huge scope for their exploitation by hackers if they contain a security flaw. (They invariably do.) Who, after all, bothers to update a lightbulb?
Earlier this year, the National Security Agency’s hacking chief, Rob Joyce, sounded caution over these kinds of devices. Their security is “something that keeps me up at night,” he said at the time.
His concern is understandable. Back in 2013, security researcher HD Moore set about interrogating the entire Internet from a stack of computers at his home. He found thousands of industrial and business devices that were insecure and vulnerable to attack. By now, that number could be much higher.
While it’s unfortunate for Brian Krebs and OVH that their servers were taken down, no great harm has been done. But when industrial devices become a part of these attacks, there may be more to fear.
Original article here.
For the IT sector, the concept of digital transformation represents a time for evolution, revolution and opportunity, according to Information Technology Association of Canada (ITAC) president Robert Watson.
The new president for the technology association made the statements at last week’s IDC Directions and CanadianCIO Symposium in Toronto. The tech trends event was co-hosted by ITWC and IDC with support from ITAC.
Notable sessions included the ITWC-moderated Digital Transformation panel — which featured veteran CIOs discussing the digital transformation opportunities and challenges— and IDC Canada’s Nigel Wallis outlining why Canadian business models should shift to reap IoT rewards.
Digital transformation refers to the changes associated with applying digital technology across all aspects of human society; the overarching event theme framed digital transformation not as a mere buzzword but as a process that tech leaders and organizations should already be adopting. Considering the IT department is the “substance of every industry,” it follows that information technology can play a key role in setting the pace for innovation and future developments, offered Watson.
Both the public and private sectors are looking to diversify operations and economies — the IT sector will be the leaders and enable development of emerging technologies including the Internet of Things (IoT): “It is coming for sure and a fantastic opportunity.”
With that in mind, here are four key takeaways from the event.
“Have you ever seen a more dynamic, exciting, and scary time in our industry?”
IDC’s senior vice president and chief analyst Frank Gens outlined reasons why IT is currently entering an “innovation stage” with the era of the Third Platform, which refers to emerging tech such as cloud technology, mobile, social and IoT.
According to IDC, the Third Platform is anticipated to grow by approximately 20 per cent by the year 2020; eighty per cent of Fortune 100 companies are expected to have digital transformation teams in place by the end of this year.
“It’s about a new foundation for enterprise services. You can connect back-end AI to this growing edge of IoT…you are really talking about collective learning and accelerated learning around the next foundation of enterprise solutions,” said Gens.
Takeaway: In a cloud- and mobile-dominated IT world, enterprises should look to quickly develop platform- and API-based services across their network, noted Gens, while also looking to grow the developer base to use those services.
“Robotics is an extremely vertical driven solution.”
Think of that classic 1927 film Metropolis, and its anthropomorphic robot Maria: While IT has come a long way from Metropolis in terms of developments in robotics, the industry isn’t quite there yet. But we’re close, noted IDC research analyst Vlad Mukherjee, and the industry should look at current advancements in the field.
According to Mukherjee, robotics are driving digital transformation processes by establishing new revenue streams and changing the way we work.
Currently, robotics tech is classified in terms of commercial service, industry, and consumer. Canadian firms in total are currently spending $1.08 billion on the technology, Mukherjee said.
Early adopters are looking at reducing costs; this includes the automotive and manufacturing sectors, but also fields such as healthcare, logistics, and resource extraction. In the case of commercial service robotics, the concept works and the business case is there, but not at the point where we can truly take advantage, he said.
The biggest expense for robotics is service, maintenance, and battery life, said Mukherjee.
Takeaway: Industrial robots are evolving to become more flexible, easier to set up, more supportive of human interaction, and more autonomously mobile. Enterprises should keep abreast of robotics developments, particularly the rise of collaborative industrial robots, which have a lower barrier for SME adoption. This includes considering pilot programs and use cases that explore how the technology can help improve operations and automated processes.
“China has innovated significantly in terms of business models that the West has yet to emulate.”
Analysts Bryan Ma, vice-president of client devices for IDC Asia-Pacific, and Krista Collins, research manager of mobility and consumer for IDC Canada, outlined mobility trends and why the mobility and augmented or virtual reality markets seen in the east will inevitably make their way to Canada.
China is no longer considered a land of cheap knockoffs, said Ma, pointing to the rise of companies like Xiaomi, considered “The Apple of China.”
Globally, shipments of virtual reality (VR) hardware are expected to skyrocket this year, according to IDC’s forecasts. It expects shipments to hit 9.6 million units worldwide, generating $2.3 billion mostly for the four lead manufacturers: Samsung, Sony, HTC, and Oculus.
With VR in its early days, both Ma and Collins see the most growth potential for the emerging medium coming from the consumer market. Gaming and even adult entertainment options promise to be the first use-cases for mass adoption, with applications in the hospitality, real estate, or travel sectors coming later.
“That will be bigger on the consumer side of the market,” Collins said. “That’s what we’ll see first here in Canada and in other parts of the world.”
Takeaway: Augmented reality (AR) headsets will take longer to ramp up, IDC expects. In 2016, less than half a million units will ship. That will quickly climb to 45.6 million units by 2020, chasing the almost 65 million expected shipments of VR headsets. But unlike VR, the first applications for AR will be in the business world.
“Technology is integrated with everything”
There are currently more than 3.8 billion mobile phones on the planet — just think of the opportunities, offered David Senf, vice president of infrastructure solutions for IDC Canada.
He argued that digital transformation is an even bigger consideration than security, and that responding to business needs is a top priority for IT in 2016. IT staff spent two more weeks “keeping the lights on” in 2015 than they did on new, innovative projects. This has to change, said Senf.
IT is living in an era of big data and advanced analytics. As cloud technology matures — from just being software-as-a-service (SaaS) to platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS) — CIOs should think about the cloud in a new way. Instead of just the cloud, it’s a vital architecture that should be supporting the business.
“Organizations are starting to define what that architecture looks like,” said Senf, adding the successful ones understand that the cloud is a competitive driver, particularly from an identity management, cost, and data residency perspective.
Takeaway: If the infrastructure isn’t already ready for big data, it might already be behind the curve. Senf notes CIOs should ensure that the IT department is able to scale quickly for change and is ready to support the growing demands of the business side, including mobility and public cloud access.
Get ready to experiment and become comfortable with data sources and analysis. This includes looking at the nature of probabilistic findings — and using PaaS, he added.
Read more: http://www.itworldcanada.com/article/the-future-of-it-four-points-on-why-digital-transformation-is-a-big-deal/383121#ixzz4MFnnfcAh
or visit http://www.itworldcanada.com for more Canadian IT News
When we started this decade, the Internet of Things was basically a buzzword, talked about by a few, acted upon by fewer, a challenge to save for the future, like 2015 or 2020.
But as a famous character once said in a movie that’s now 30 years old, “life moves pretty fast…” and now, here we are with 2015 in the rear view mirror and our 2020 vision becoming clearer by the minute.
Everyone’s talking about the Internet of Things, even the “things,” which can now request and deliver customer support, tell if you’re being as productive as you could be at work, let your doctor know if you’re following orders (or not), reduce inefficiencies in energy consumption, improve business processes, predict issues and proactively improve or resolve them based on data received.
The Internet of Things (IoT) is just getting started. These forecasts below show why organizations need to get started too (if they haven’t already) on leveraging and responding to the Internet of Things:
1. The worldwide Internet of Things market is predicted to grow to $1.7 trillion by 2020, marking a compound annual growth rate of 16.9%. – IDC Worldwide Internet of Things Forecast, 2015 – 2020.
2. An estimated 25 billion connected “things” will be in use by 2020. – Gartner Newsroom
3. Wearable technology vendors shipped 78.1 million wearable devices in 2015, an increase of 171.6% from 2014. Shipment predictions for this year are 111 million, increasing to 215 million in 2019. – IDC Worldwide Quarterly Wearable Device Tracker
4. By 2020, each person is likely to have an average of 5.1 connected devices. – Frost and Sullivan Power Management in IoT and Connected Devices
5. In a 2016 PwC survey of 1,000 U.S. consumers, 45% say they now own a fitness band, 27% a smartwatch, and 12% smart clothing. 57% say they are excited about the future of wearable technology as part of everyday life. 80% say wearable devices make them more efficient at home, 78% more efficient at work. – PwC The Wearable Life 2.0: Connected Living in a Wearable World
6. By 2020, more than half of major new business processes and systems will incorporate some element, large or small, of the Internet of Things. – Gartner Predicts 2016: Unexpected Implications Arising from the Internet of Things
7. 65% of approximately 1,000 global business executives surveyed say they agree organizations that leverage the Internet of Things will have a significant advantage; 19%, however, still say they have never heard of the Internet of Things. – Internet of Things Institute 2016 IoT Trends Survey
8. 80% of retailers worldwide say they agree that the Internet of Things will drastically change the way companies do business in the next three years. – Retail Systems Research: The Internet of Things in Retail: Great Expectations
9. By 2018, six billion things will have the ability to request support. – Gartner Predicts 2016: CRM Customer Service and Support
10. By 2020, 47% of devices will have the necessary intelligence to request support. – Gartner Predicts 2016: CRM Customer Service and Support
11. By 2025, the Internet of Things could generate more than $11 trillion a year in economic value through improvements in energy efficiency, public transit, operations management, smart customer relationship management and more. – McKinsey Global Institute Report: The Internet of Things: Mapping the value behind the Hype
12. Barcelona estimates that IoT systems have helped the city save $58 million a year from connected water management and $37 million a year via smart street lighting alone. – Harvard University Report
13. General Electric estimates that the “Industrial Internet” market (connected industrial machinery) will add $10 to $15 trillion to the global GDP within the next 20 years. – GE Reports
14. General Electric believes that using connected industrial machinery to make oil and gas exploration and development just 1% more efficient would result in a savings of $90 billion. – GE Reports
15. Connected homes will be a major part of the Internet of Things. By 2019, companies will ship 1.9 billion connected home devices, marking an estimated $490 billion in revenue (Business Insider Intelligence). By 2020, even the connected kitchen will contribute at least 15 percent savings in the food and beverage industry, leveraging data analytics. – Gartner Research Predicts 2015: The Internet of Things
The Internet of Things is accelerating the transformation of the way we live and work. Life moves pretty fast. Stop and look around, but don’t miss it. Is your organization leveraging the Internet of Things?
Original article here.
How do you create a movie trailer about an artificially enhanced human?
You turn to the real thing – artificial intelligence.
20th Century Fox has partnered with IBM Research to develop the first-ever “cognitive movie trailer” for its upcoming suspense/horror film, “Morgan”. Fox wanted to explore using artificial intelligence (AI) to create a horror movie trailer that would keep audiences on the edge of their seats.
Movies, especially horror movies, are incredibly subjective. Think about the scariest movie you know (for me, it’s the 1976 movie, “The Omen”). I can almost guarantee that if you ask the person next to you, they’ll have a different answer. There are patterns and types of emotions in horror movies that resonate differently with each viewer, and the intricacies and interrelation of these are what an AI system would have to identify and understand in order to create a compelling movie trailer. Our team was faced with the challenge of not only teaching a system to understand, “what is scary”, but then to create a trailer that would be considered “frightening and suspenseful” by a majority of viewers.
As with any AI system, the first step was training it to understand a subject area. Using machine learning techniques and experimental Watson APIs, our Research team trained a system on the trailers of 100 horror movies by segmenting out each scene from the trailers. Once each trailer was segmented into “moments”, the system completed the following:
1) A visual analysis and identification of the people, objects and scenery. Each scene was tagged with an emotion from a broad bank of 24 different emotions and labels from across 22,000 scene categories, such as eerie, frightening and loving;
2) An audio analysis of the ambient sounds (such as the character’s tone of voice and the musical score), to understand the sentiments associated with each of those scenes;
3) An analysis of each scene’s composition (such as the location of the shot, the image framing, and the lighting), to categorize the types of locations and shots that traditionally make up suspense/horror movie trailers.
The analysis was performed on each area separately and in combination with each other using statistical approaches. The system now “understands” the types of scenes that categorically fit into the structure of a suspense/horror movie trailer.
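The three analyses above can be pictured as feature signals that get combined into a single score per scene, with the top scorers short-listed as trailer candidates. The real system used experimental, proprietary Watson APIs; the sketch below is a toy illustration, and every feature name, weight, and score in it is invented for the example.

```python
# Toy sketch of scene scoring: combine visual, audio, and
# composition signals into one score, then rank the scenes.
# All feature names and weights here are invented illustrations.
from typing import Dict, List

def score_scene(scene: Dict[str, float],
                weights: Dict[str, float]) -> float:
    # Linear combination of whatever signals the analyses produced.
    return sum(weights.get(k, 0.0) * v for k, v in scene.items())

def shortlist(scenes: List[Dict[str, float]],
              weights: Dict[str, float], n: int = 10) -> List[int]:
    # Indices of the n highest-scoring candidate moments.
    ranked = sorted(range(len(scenes)),
                    key=lambda i: score_scene(scenes[i], weights),
                    reverse=True)
    return ranked[:n]

scenes = [
    {"visual_eerie": 0.9, "audio_tension": 0.8, "framing_close": 0.6},
    {"visual_loving": 0.7, "audio_calm": 0.9, "framing_wide": 0.4},
    {"visual_eerie": 0.2, "audio_calm": 0.3, "framing_wide": 0.5},
]
weights = {"visual_eerie": 1.0, "audio_tension": 1.0,
           "visual_loving": 0.8, "audio_calm": 0.2,
           "framing_close": 0.3}
print(shortlist(scenes, weights, n=2))  # [0, 1]
```

Note how the highest scorers here are the eerie/tense scene and the tender one, echoing the mix of suspenseful and tender moments the system actually surfaced for “Morgan”.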
Then, it was time for the real test. We fed the system the full-length feature film, “Morgan”. After the system “watched” the movie, it identified 10 moments that would be the best candidates for a trailer. In this case, these happened to reflect tender or suspenseful moments. If we were working with a different movie, perhaps “The Omen”, it might have selected different types of scenes. If we were working with a comedy, it would have a different set of parameters to select different types of moments.
It’s important to note that there is no “ground truth” with creative projects like this one. Neither our team nor the Fox team knew exactly what we were looking for before we started the process. Based on our training and testing of the system, we knew that tender and suspenseful scenes would be short-listed, but we didn’t know which ones the system would pick to create a complete trailer. As most creative projects go, we thought, “we’ll know it when we see it.”
Our system could select the moments, but it’s not an editor. We partnered with a resident IBM filmmaker to arrange and edit each of the moments together into a comprehensive trailer. You’ll see his expertise in the addition of black title cards, the musical overlay and the order of moments in the trailer.
Not surprisingly, our system chose some moments in the movie that were not included in other “Morgan” trailers. The system allowed us to look at moments in the movie in different ways – moments that might not have traditionally made the cut were now short-listed as candidates. On the other hand, when we reviewed all the scenes that our system selected, one didn’t seem to fit with the bigger story we were trying to tell, so we decided not to use it. Even Watson sometimes ends up with footage on the cutting room floor!
Traditionally, creating a movie trailer is a labor-intensive, completely manual process. Teams have to sort through hours of footage and manually select each and every potential candidate moment. This process is expensive and time-consuming, taking anywhere between 10 and 30 days to complete.
From a 90-minute movie, our system provided our filmmaker a total of six minutes of footage. From the moment our system watched “Morgan” for the first time, to the moment our filmmaker finished the final editing, the entire process took about 24 hours.
Reducing the time of a process from weeks to hours – that is the true power of AI.
The combination of machine intelligence and human expertise is a powerful one. This research investigation is simply the first of many into what we hope will be a promising area of machine and human creativity. We don’t have the only solution for this challenge, but we’re excited about pushing the possibilities of how AI can augment the expertise and creativity of individuals.
AI is being put to work across a variety of industries: helping scientists discover promising treatment pathways to fight diseases, or helping law experts discover connections between cases. Film making is just one more example of how cognitive computing systems can help people make new discoveries.
Original article here.
Cambridge, U.K.-based startup Kaleao Ltd. is entering the hyper-converged systems market today with a platform based on the ARM chip architecture that it claims can achieve unparalleled scalability and performance at a fraction of the cost of competing systems.
The KMAX platform features an integrated OpenStack cloud environment and miniature hypervisors that dynamically define physical computing resources and assign them directly to virtual machines and applications. These “microvisors,” as Kaleao calls them, dynamically orchestrate global pools of software-defined and hardware-accelerated resources with much lower overhead than that of typical hypervisors. Users can still run the KVM hypervisor if they want.
The use of the ARM 64-bit processor distinguishes Kaleao from the pack of other hyper-converged vendors such as VMware Inc., Nutanix Inc. and SimpliVity Inc., which use Intel chips. ARM is a reduced instruction set computing-based architecture that is commonly used in mobile devices because of its low power consumption.
“We went with ARM because the ecosystem allows for more differentiation and it’s a more open platform,” said Giovanbattista Mattiussi, principal marketing manager at Kaleao. “It enabled us to rethink the architecture itself.”
One big limitation of ARM is that it’s unable to support the Windows operating system or VMware vSphere virtualization manager. Instead, Kaleao is bundling Ubuntu Linux and OpenStack, figuring those are the preferred choices for cloud service providers and enterprises that are building private clouds. Users can also install any other Linux distribution.
Kaleao said the low overhead of its microvisors, combined with the performance of ARM processors, enables it to deliver 10 times the performance of competing systems at less than one-third of the energy consumption. Users can run four to six times as many microvisors as hypervisors, Mattiussi said. “It’s like the VM is running on the hardware with no software layers in between,” he said. “We can pick up a piece of the CPU here, a piece of storage there. It’s like having a bare-bones server running under the hypervisor.”
The platform provides up to 1,536 CPU cores and 370 TB of all-flash storage with 960 gigabytes per second of networking in a 3U rack. Energy usage is less than 15 watts per eight-core server. “Scalability is easy,” Mattiussi said. “You just need to add pieces of hardware.”
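The published figures imply a striking density, which a quick back-of-envelope check makes concrete. The sketch below simply combines the core count and per-server wattage quoted above; the resulting rack power is an upper bound derived from those numbers, not a figure from Kaleao.

```python
# Density check derived from the published KMAX figures.
cores_per_rack = 1536
cores_per_server = 8
watts_per_server = 15  # "less than 15 watts per eight-core server"

servers = cores_per_rack // cores_per_server
rack_power_w = servers * watts_per_server

print(servers)       # 192 eight-core servers in one 3U unit
print(rack_power_w)  # under ~2880 W for the full complement
```

At the quoted $600 to $700 per server, those 192 servers would also put a fully populated unit in the low six figures, consistent with the roughly $10,000-per-16-server pricing below.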
KMAX will be available in January in server and appliance versions. The company hasn’t released pricing but said its cost structure enables prices in the range of $600 to $700 per server, or about $10,000 for a 16-server blade. It plans to sell direct and through distributors. The company has opened a U.S. office in Charlotte, NC and has European outposts in Italy, Greece and France.
Co-founders Giampietro Tecchiolli and John Goodacre have a long track record of work in hardware and chip design, and both are active in the Euroserver green computing project. Goodacre continues to serve as director of technology and systems at ARM Holdings plc, which designs the ARM processor.
Kaleao has raised €3 million and said it’s finalizing a second round of €5 million.
Original article here.