Tools Archives - AppFerret


Tools I Wish I Had Known About When I Started Coding

2018-03-15

In the tech world, there are thousands of tools that people will tell you to use. How are you supposed to know where to start?

As somebody who started coding relatively recently, this downpour of information was too much to sift through. I found myself installing extensions that did not really help me in my development cycle, and often even got in the way of it.

I am by no means an expert, but over time I have compiled a list of tools that have proven extremely useful to me. If you are just starting to learn how to program, this will hopefully offer you some guidance. If you are a seasoned developer, hopefully you will still learn something new.

I am going to break this article up into Chrome extensions and VS Code extensions. I know there are other browsers and other text editors, but I am willing to bet most of the tools are also available for your platform of choice, so let’s not start a religious argument over our personal preferences.

Feel free to jump around.

Chrome Extensions

Now that I am a self-proclaimed web developer, I practically live in my Chrome console. Below are some tools that allow me to spend less time there:

  • WhatFont — The name says it all. This is an easy way of finding out the fonts that your favorite website is using, so that you can borrow them for your own projects.
  • Pesticide — Useful for seeing the outlines of your <div>s and modifying CSS. This was a lifesaver when I was trying to learn my way around the box-model.
  • ColorZilla — Useful for copying exact colors off of a website. This copies a color straight to your clipboard so you don’t spend forever trying to get the right RGBA combination.
  • CSS Peeper — Useful for looking at colors and assets used on a website. A good exercise, especially when starting out, is cloning websites that you think look cool. This gives you a peek under the hood at their color scheme and allows you to see what other assets exist on their page.
  • Wappalyzer — Useful for seeing the technologies being used on a website. Ever wonder what kind of framework a website is using or what service it is hosted on? Look no further.
  • React Dev Tools — Useful for debugging your React applications. It bears mentioning that this is only useful if you are programming a React application.
  • Redux Dev Tools — Useful for debugging applications using Redux. It bears mentioning that this is only useful if you are implementing Redux in your application.
  • JSON Formatter — Useful for making JSON look cleaner in the browser. Have you ever stared an ugly JSON blob in the face, trying to figure out how deeply nested the information you want is? Well, this makes it so that it only takes 2 hours instead of 3. (If you want the same tidying outside the browser, see the quick sketch after this list.)
  • Vimeo Repeat and Speed — Useful for speeding up Vimeo videos. If you watch video tutorials like most web developers, you know how handy it is to consume them at 1.25 times the regular playback speed. There are also versions for YouTube.
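
Speaking of tidying JSON: none of this requires an extension if the blob is already on disk. Here is a minimal sketch using only Python's standard library; the file name blob.json is a made-up example:

```python
# A quick illustration of what JSON Formatter does, using only the
# standard library. The input file blob.json is a hypothetical example.
import json

with open("blob.json") as f:
    data = json.load(f)

# indent=2 makes the nesting visible; sort_keys makes big blobs scannable.
print(json.dumps(data, indent=2, sort_keys=True))
```

The built-in command python -m json.tool blob.json does much the same thing straight from the terminal.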

VS Code Extensions

Visual Studio Code is my editor of choice.

People love their text editors, and I am no exception. However, I’m willing to bet most of these extensions work for whatever editor you are using as well. Check out my favorite extensions:

  • Auto Rename Tag — Auto-renames paired HTML tags. You created a <p> tag. Now you want to change it, as well as its closing </p> tag, to something else. Simply change one and the other will follow. Theoretically improves your productivity by a factor of 2.
  • HTML CSS Support — CSS support for HTML documents. This is useful for getting some neat syntax highlighting and code suggestions so that CSS only makes you want to quit coding a couple of times a day.
  • HTML Snippets — Useful code snippets. Another nice time saver. Pair this with Emmet and you barely ever have to type real HTML again.
  • Babel ES6/ES7 — Adds JavaScript Babel syntax coloring. If you are using Babel, this will make it much easier to differentiate what is going on in your code. This is neat if you like to play with modern features of JavaScript.
  • Bracket Pair Colorizer — Adds colors to brackets for easier block visualization. This is handy for those all-too-common bugs where you didn’t close your brackets or parentheses properly.
  • ESLint — Integrates ESLint into Visual Studio Code. This is handy for getting hints about bugs as you are writing your code and, depending on your configuration, it can help enforce good coding style.
  • Guides — Adds extra guide lines to code. This is another visual cue to make sure that you are closing your brackets correctly. If you can’t tell, I’m a very visual person.
  • JavaScript Console Utils — Makes for easier console logging. If you are like most developers, you will find yourself logging to the console in your debugging flow (I know that we are supposed to use the debugger). This utility makes it easy to create useful console.log() statements.
  • Code Spell Checker — Spelling checker that accounts for camelCase. Another common source of bugs is fat-thumbing a variable or function name. This spell checker will look for uncommon words and is good about accounting for the way we write things in JavaScript.
  • GitLens — Makes it easier to see when, and by whom, changes were made. This is nice for blaming the appropriate person when code gets broken, since it is absolutely never your fault.
  • Path Intellisense — File path autocompletion. This is super handy for importing things from other files. It makes navigating your file tree a breeze.
  • Prettier — Automatic code formatter. Forget about the days when you had to manually indent your code and make things human-legible. Prettier will do this for you much faster, and better, than you ever could on your own. I can’t recommend this one enough.
  • VSCode-Icons — Adds icons to the file tree. If looking at your file structure hurts your eyes, this might help. There is a helpful icon for just about any kind of file you are making which will make it easier to distinguish what you are looking at.

In Conclusion

You likely have your own set of tools that are indispensable to your development cycle. Hopefully some of the tools I mentioned above can make your workflow more efficient.

Do not fall into the trap, however, of installing every tool you run across before learning to use the ones you already have, as this can be a huge time-sink.

I encourage you to leave your favorite tools in the comments below, so that we can all learn together.

If you liked this article, please give it some claps, check out the other articles I’ve written, and give me a follow on Twitter.

Original article here.


Great R packages for data import, wrangling and visualization

2017-06-23

One of the great things about R is the thousands of packages users have written to solve specific problems in various disciplines — analyzing everything from weather or financial data to the human genome — not to mention analyzing computer security-breach data.

Some tasks are common to almost all users, though, regardless of subject area: data import, data wrangling and data visualization. The table below shows my favorite go-to packages for each of these three tasks (plus a few miscellaneous ones tossed in). The package names in the table are clickable if you want more information. To find out more about a package once you’ve installed it, type help(package = "packagename") in your R console (of course substituting the actual package name).

See original article and interactive table here.


New Leader, Trends, and Surprises in Analytics, Data Science, Machine Learning Software Poll

2017-05-24

Python caught up with R and (barely) overtook it; Deep Learning usage surges to 32%; RapidMiner remains top general Data Science platform; Five languages of Data Science.

The 18th annual KDnuggets Software Poll again drew huge participation from the analytics and data science community and vendors, attracting about 2,900 voters, almost exactly the same as last year. Here is the initial analysis, with more detailed results to be posted later.

Python, whose share has been growing faster than R for the last several years, has finally caught up with R, and (barely) overtook it, with 52.6% share vs 52.1% for R.

The biggest surprise is probably the phenomenal share of Deep Learning tools, now used by 32% of all respondents, while only 18% used DL in 2016 and 9% in 2015. Google TensorFlow rapidly became the leading Deep Learning platform with 20.2% share, up from only 6.8% in the 2016 poll, and entered the top 10 tools.

While in 2014 I wrote about the four main languages for Analytics, Data Mining, and Data Science being R, Python, SQL, and SAS, the five main languages of Data Science in 2017 appear to be Python, R, SQL, Spark, and TensorFlow.

RapidMiner remains the most popular general platform for data mining/data science, with about 33% share, almost exactly the same as in 2016.

We note that many vendors have encouraged their users to vote, but all vendors had equal chances, so this does not violate KDnuggets guidelines. We have not seen any bot voting or direct links to vote for only one tool this year.

Spark grew to about 23% and kept its place in top 10 ahead of Hadoop.

Besides TensorFlow, another new tool in the top tier is Anaconda, with 22% share.

Top Analytics/Data Science Tools

Fig 1: KDnuggets Analytics/Data Science 2017 Software Poll: top tools in 2017, and their share in the 2015 and 2016 polls

See original full article here.



10 new AWS cloud services you never expected

2017-01-27

From data scooping to facial recognition, Amazon’s latest additions give devs new, wide-ranging powers in the cloud

In the beginning, life in the cloud was simple. Type in your credit card number and—voilà—you had root on a machine you didn’t have to unpack, plug in, or bolt into a rack.

That has changed drastically. The cloud has grown so complex and multifunctional that it’s hard to jam all the activity into one word, even a word as protean and unstructured as “cloud.” There are still root logins on machines to rent, but there are also services for slicing, dicing, and storing your data. Programmers don’t need to write and install as much as subscribe and configure.

Here, Amazon has led the way. That’s not to say there isn’t competition. Microsoft, Google, IBM, Rackspace, and Joyent are all churning out brilliant solutions and clever software packages for the cloud, but no company has done more to create feature-rich bundles of services for the cloud than Amazon. Now Amazon Web Services is zooming ahead with a collection of new products that blow apart the idea of the cloud as a blank slate. With the latest round of tools for AWS, the cloud is that much closer to becoming a concierge waiting for you to wave your hand and give it simple instructions.

Here are 10 new services that show how Amazon is redefining what computing in the cloud can be.

Glue

Anyone who has done much data science knows it’s often more challenging to collect data than it is to perform analysis. Gathering data and putting it into a standard data format is often more than 90 percent of the job.

Glue is a new collection of Python scripts that automatically crawl your data sources, collect data, apply any necessary transforms, and stick the results in Amazon’s cloud. It reaches into your data sources, snagging data using all the standard acronyms, like JSON, CSV, and JDBC. Once it grabs the data, it can analyze the schema and make suggestions.

The Python layer is interesting because you can use it without writing or understanding Python—although it certainly helps if you want to customize what’s going on. Glue will run these jobs as needed to keep all the data flowing. It won’t think for you, but it will juggle many of the details, leaving you to think about the big picture.
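
Everything Glue does can also be driven from code. Here is a hedged sketch using the boto3 SDK, where the crawler name, IAM role, database, and S3 path are all hypothetical placeholders and AWS credentials are assumed to be configured:

```python
# A sketch of pointing a Glue crawler at an S3 data source with boto3.
# All names below are hypothetical placeholders.
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# The crawler visits the data source and infers a schema from what it finds.
glue.create_crawler(
    Name="sales-crawler",
    Role="arn:aws:iam::123456789012:role/GlueCrawlerRole",
    DatabaseName="sales_catalog",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/raw-sales/"}]},
)
glue.start_crawler(Name="sales-crawler")

# Once the crawl finishes, the suggested tables appear in the Data Catalog.
for table in glue.get_tables(DatabaseName="sales_catalog")["TableList"]:
    print(table["Name"])
```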

FPGA

Field Programmable Gate Arrays have long been a secret weapon of hardware designers. Anyone who needs a special chip can build one out of software. There’s no need to build custom masks or fret over fitting all the transistors into the smallest amount of silicon. An FPGA takes your software description of how the transistors should work and rewires itself to act like a real chip.

Amazon’s new AWS EC2 F1 brings the power of FPGAs to the cloud. If you have highly structured and repetitive computing to do, an EC2 F1 instance is for you. With EC2 F1, you can create a software description of a hypothetical chip and compile it down to a tiny number of gates that will compute the answer in the shortest amount of time. The only thing faster is etching the transistors in real silicon.

Who might need this? Bitcoin miners compute the same cryptographically secure hash function a bazillion times each day, which is why many bitcoin miners use FPGAs to speed up the search. If you have a similarly compact, repetitive algorithm that can be written into silicon, an FPGA instance lets you rent machines to run it now. The biggest winners are those who need to run calculations that don’t map easily onto standard instruction sets—for example, when you’re dealing with bit-level functions and other nonstandard, nonarithmetic calculations. If you’re simply adding a column of numbers, the standard instances are better for you. But for some, EC2 with FPGAs might be a big win.

Blox

As Docker eats its way into the stack, Amazon is trying to make it easier for anyone to run Docker instances anywhere, anytime. Blox is designed to juggle the clusters of instances so that the optimum number are running—no more, no less.

Blox is event driven, so it’s a bit simpler to write the logic. You don’t need to constantly poll the machines to see what they’re running. They all report back, so the right number can run. Blox is also open source, which makes it easier to reuse Blox outside of the Amazon cloud, if you should need to do so.

X-Ray

Monitoring the efficiency and load of your instances used to be simply another job. If you wanted your cluster to work smoothly, you had to write the code to track everything. Many people brought in third parties with impressive suites of tools. Now Amazon’s X-Ray is offering to do much of the work for you. It’s competing with many third-party tools for watching your stack.

When a website gets a request for data, X-Ray traces the request as it flows through your network of machines and services. Then X-Ray will aggregate the data from multiple instances, regions, and zones so that you can stop in one place to flag a recalcitrant server or a wedged database. You can watch your vast empire with only one page.
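
On the instrumentation side, adding a trace to your own code is lightweight. A minimal sketch with the aws-xray-sdk Python package follows; the service, segment, and function names are invented, and a local X-Ray daemon is assumed to be listening:

```python
# A minimal tracing sketch with the aws-xray-sdk package. Names are
# hypothetical; an X-Ray daemon is assumed to be running locally to
# forward trace data to AWS.
from aws_xray_sdk.core import xray_recorder

xray_recorder.configure(service="checkout-service")

# Each captured function shows up as a timed subsegment in the trace.
@xray_recorder.capture("query_inventory")
def query_inventory(item_id):
    return {"item": item_id, "in_stock": True}

# Subsegments need an open segment to attach to.
xray_recorder.begin_segment("handle_request")
try:
    query_inventory("sku-123")
finally:
    xray_recorder.end_segment()
```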

Rekognition

Rekognition is a new AWS tool aimed at image work. If you want your app to do more than store images, Rekognition will chew through images searching for objects and faces using some of the best-known and tested machine vision and neural-network algorithms. There’s no need to spend years learning the science; you simply point the algorithm at an image stored in Amazon’s cloud, and voilà, you get a list of objects and a confidence score ranking how likely it is that each answer is correct. You pay per image.

The algorithms are heavily tuned for facial recognition. The algorithms will flag faces, then compare them to each other and to reference images to help you identify them. Your application can store the meta information about the faces for later processing. Once you put a name to the metadata, your app will find people wherever they appear. Identification is only the beginning. Is someone smiling? Are their eyes closed? The service will deliver the answer, so you don’t need to get your fingers dirty with pixels. If you want to use impressive machine vision, Amazon will charge you not by the click but by the glance at each image.
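
In code, pointing the algorithm at an image is a single API call per question. A sketch with boto3, where the bucket and file names are placeholders:

```python
# A sketch of Rekognition's label and face detection via boto3.
# The bucket and object names are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
image = {"S3Object": {"Bucket": "my-photo-bucket", "Name": "crowd.jpg"}}

# Objects in the scene, each with a confidence score.
labels = rekognition.detect_labels(Image=image, MaxLabels=10, MinConfidence=75)
for label in labels["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

# Faces, with attributes such as smiles and whether eyes are open.
faces = rekognition.detect_faces(Image=image, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    print("smiling:", face["Smile"]["Value"],
          "eyes open:", face["EyesOpen"]["Value"])
```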

Athena

Working with Amazon’s S3 has always been simple. If you want a data structure, you request it and S3 looks for the part you want. Amazon’s Athena now makes it much simpler. It will run the queries on S3, so you don’t need to write the looping code yourself. Yes, we’ve become too lazy to write loops.

Athena uses SQL syntax, which should make database admins happy. Amazon will charge you for every byte that Athena churns through while looking for your answer. But don’t get too worried about the meter running out of control, because the price is only $5 per terabyte. That works out to roughly half a billionth of a cent per byte. It makes the penny candy stores look expensive.
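
In practice, a query is a SQL string plus an S3 location for the results. A rough boto3 sketch, where the database, table, and bucket names are invented:

```python
# A sketch of querying S3 data through Athena with boto3. The database,
# table, and result bucket are hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) FROM web_logs GROUP BY status",
    QueryExecutionContext={"Database": "logs_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
query_id = query["QueryExecutionId"]

# Poll until the query finishes; results also land in the S3 bucket above.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)
    status = state["QueryExecution"]["Status"]["State"]
    if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if status == "SUCCEEDED":
    results = athena.get_query_results(QueryExecutionId=query_id)
    for row in results["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```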

Lambda@Edge

The original idea of a content delivery network was to speed up the delivery of simple files like JPG images and CSS files by pushing out copies to a vast array of content servers parked near the edges of the Internet. Amazon is taking this a step further by letting us push Node.js code out to these edges, where it will run and respond. Your code won’t sit on one central server waiting for the requests to poke along the backbone from people around the world. It will clone itself, so it can respond in microseconds without being impeded by all that network latency.

Amazon will bill your code only when it’s running. You won’t need to set up separate instances or rent out full machines to keep the service up. It is currently in a closed test, and you must apply to get your code in their stack.

Snowball Edge

If you want some kind of physical control of your data, the cloud isn’t for you. The power and reassurance that comes from touching the hard drive, DVD-ROM, or thumb drive holding your data isn’t available to you in the cloud. Where is my data exactly? How can I get it? How can I make a backup copy? The cloud makes anyone who cares about these things break out in cold sweats.

The Snowball Edge is a box filled with data that can be delivered anywhere you want. It even has a shipping label that’s really an E-Ink display, exactly like the one Amazon puts on a Kindle. When you want a copy of massive amounts of data that you’ve stored in Amazon’s cloud, Amazon will copy it to the box and ship the box to wherever you are. (The documentation doesn’t say whether Prime members get free shipping.)

Snowball Edge serves a practical purpose. Many developers have collected large blocks of data through cloud applications, and downloading these blocks across the open internet is far too slow. If Amazon wants to attract large data-processing jobs, it needs to make it easier to get large volumes of data out of the system.

If you’ve accumulated an exabyte of data that you need somewhere else for processing, Amazon has a bigger version called Snowmobile that’s built into an 18-wheel truck complete with GPS tracking.

Oh, it’s worth noting that the boxes aren’t dumb storage boxes. They can run arbitrary Node.js code too, so you can search, filter, or analyze … just in case.

Pinpoint

Once you’ve amassed a list of customers, members, or subscribers, there will be times when you want to push a message out to them. Perhaps you’ve updated your app or want to convey a special offer. You could blast an email to everyone on your list, but that’s a step above spam. A better solution is to target your message, and Amazon’s new Pinpoint tool offers the infrastructure to make that simpler.

You’ll need to integrate some code with your app. Once you’ve done that, Pinpoint helps you send out the messages when your users seem ready to receive them. Once you’re done with a so-called targeted campaign, Pinpoint will collect and report data about the level of engagement with your campaign, so you can tune your targeting efforts in the future.
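
As a sense of scale, the sending side reduces to a single API call once your app is registered. A rough boto3 sketch, where the application ID, device token, and message text are all hypothetical placeholders:

```python
# A rough sketch of sending a targeted push message with boto3's Pinpoint
# client. The application ID, device token, and message body are
# hypothetical placeholders.
import boto3

pinpoint = boto3.client("pinpoint", region_name="us-east-1")

pinpoint.send_messages(
    ApplicationId="a1b2c3d4e5f6",
    MessageRequest={
        # Address individual endpoints; here, a single iOS device token.
        "Addresses": {"example-device-token": {"ChannelType": "APNS"}},
        "MessageConfiguration": {
            "APNSMessage": {
                "Title": "We just shipped v2.0",
                "Body": "Open the app to see what's new.",
            }
        },
    },
)
```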

Polly

Who gets the last word? Your app can, if you use Polly, the latest generation of speech synthesis. In goes text and out comes sound—sound waves that form words that our ears can hear, all the better to make audio interfaces for the internet of things.
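
The API surface really is that small: text in, audio out. A minimal boto3 sketch, where the voice and output file name are arbitrary choices:

```python
# A minimal sketch of speech synthesis with boto3's Polly client.
# The voice and output file name are arbitrary choices.
import boto3

polly = boto3.client("polly", region_name="us-east-1")

response = polly.synthesize_speech(
    Text="Your package has shipped and arrives tomorrow.",
    OutputFormat="mp3",
    VoiceId="Joanna",
)

# The audio comes back as a stream; write it out for playback.
with open("speech.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```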

Original article here.



Curated Catalog of Visualization Tools

2017-01-20

There are a lot of visualization-related tools out there. Here’s a simple categorized collection of what’s available, with a focus on the free and open source stuff.

This site features a curated selection of data visualization tools meant to bridge the gap between programmers/statisticians and the general public by only highlighting free/freemium, responsive and relatively simple-to-learn technologies for displaying both basic and complex, multivariate datasets. It leans heavily toward open-source software and plugins, rather than expensive enterprise BI solutions.

I found some broken links, and the descriptions need a little editing, but it’s a good place to start.

Also, if you’re just starting out with visualization, you might find all the resources a bit overwhelming. If that’s the case, don’t fret. You don’t have to learn how to use all of them. Let your desired outcomes guide you. Here’s what I use.

Original article here.



Tech trends for 2017: more AI, machine intelligence, connected devices and collaboration

2016-12-30

The end of year or beginning of year is always a time when we see many predictions and forecasts for the year ahead. We often publish a selection of these to show how tech-based innovation and economic development will be impacted by the major trends.

A number of trends reports and articles have been published – ranging from investment houses to research firms, and even innovation agencies. In this article we present headlines and highlights of some of these trends – from Gartner, GP Bullhound, Nesta and Ovum.

Artificial intelligence will have the greatest impact

GP Bullhound released its 52-page research report, Technology Predictions 2017, which says artificial intelligence (AI) is poised to have the greatest impact on the global technology sector. It will experience widespread consumer adoption, particularly as virtual personal assistants such as Apple Siri and Amazon Alexa grow in popularity, alongside the automation of repetitive data-driven tasks within enterprises.

Online streaming and e-sports are also significant market opportunities in 2017 and there will be a marked growth in the development of content for VR/AR platforms. Meanwhile, automated vehicles and fintech will pose longer-term growth prospects for investors.

The report also examines the growth of Europe’s unicorn companies. It highlights the potential for several firms to reach a $10 billion valuation and become ‘decacorns’, including BlaBlaCar, Farfetch, and HelloFresh.

Alec Dafferner, partner, GP Bullhound, commented, “The technology sector has faced up to significant challenges in 2016, from political instability through to greater scrutiny of unicorns. This resilience and the continued growth of the industry demonstrate that there remain vast opportunities for investors and entrepreneurs.”

Big data and machine learning will be disruptors

Advisory firm Ovum says big data continues to be the fastest-growing segment of the information management software market. It estimates the big data market will grow from $1.7bn in 2016 to $9.4bn by 2020, comprising 10 percent of the overall market for information management tooling. Its 2017 Trends to Watch: Big Data report highlights that while the breakout use case for big data in 2017 will be streaming, machine learning will be the factor that disrupts the landscape the most.

Key 2017 trends:

  • Machine learning will be the biggest disruptor for big data analytics in 2017.
  • Making data science a team sport will become a top priority.
  • IoT use cases will push real-time streaming analytics to the front burner.
  • The cloud will sharpen Hadoop-Spark ‘co-opetition’.
  • Security and data preparation will drive data lake governance.

Intelligence, digital and mesh

In October, Gartner issued its top 10 strategic technology trends for 2017, and recently outlined the key themes – intelligent, digital, and mesh – in a webinar.  It said that autonomous cars and drone transport will have growing importance in the year ahead, alongside VR and AR.

“It’s not about just the IoT, wearables, mobile devices, or PCs. It’s about all of that together,” said David Cearley, vice president and Gartner fellow, according to hiddenwires magazine, on how ‘intelligence everywhere’ will put the consumer in charge. “We need to put the person at the center. Ask yourself what devices and service capabilities they have available to them.”

“We need to then look at how you can deliver capabilities across multiple devices to deliver value. We want systems that shift from people adapting to technology to having technology and applications adapt to people.  Instead of using forms or screens, I tell the chatbot what I want to do. It’s up to the intelligence built into that system to figure out how to execute that.”

Gartner’s view is that the following will be the key trends for 2017:

  • Artificial intelligence (AI) and machine learning: systems that learn, predict, adapt and potentially operate autonomously.
  • Intelligent apps: using AI, there will be three areas of focus — advanced analytics, AI-powered and increasingly autonomous business processes and AI-powered immersive, conversational and continuous interfaces.
  • Intelligent things, as they evolve, will shift from stand-alone IoT devices to a collaborative model in which intelligent things communicate with one another and act in concert to accomplish tasks.
  • Virtual and augmented reality: VR can be used for training scenarios and remote experiences. AR will enable businesses to overlay graphics onto real-world objects, such as hidden wires on the image of a wall.
  • Digital twins of physical assets combined with digital representations of facilities and environments as well as people, businesses and processes will enable an increasingly detailed digital representation of the real world for simulation, analysis and control.
  • Blockchain and distributed-ledger concepts are gaining traction because they hold the promise of transforming industry operating models in industries such as music distribution, identity verification and title registry.
  • Conversational systems will shift from a model where people adapt to computers to one where the computer ‘hears’ and adapts to a person’s desired outcome.
  • Mesh and app service architecture is a multichannel solution architecture that leverages cloud and serverless computing, containers and microservices as well as APIs (application programming interfaces) and events to deliver modular, flexible and dynamic solutions.
  • Digital technology platforms: every organization will have some mix of five digital technology platforms: Information systems, customer experience, analytics and intelligence, the internet of things and business ecosystems.
  • Adaptive security architecture: multilayered security and use of user and entity behavior analytics will become a requirement for virtually every enterprise.

The real-world vision of these tech trends

UK innovation agency Nesta also offers a vision for the year ahead, a mix of the plausible and the more aspirational, based on real-world examples of areas that will be impacted by these tech trends:

  • Computer says no: the backlash: the next big technological controversy will be about algorithms and machine learning, which increasingly make decisions that affect our daily lives; in the coming year, the backlash against algorithmic decisions will begin in earnest, with technologists being forced to confront the effects of aspects like fake news, or other events caused directly or indirectly by the results of these algorithms.
  • The Splinternet: 2016’s seismic political events and the growth of domestic and geopolitical tensions mean governments will become wary of the internet’s influence, and countries around the world could pull the plug on the open, global internet.
  • A new artistic approach to virtual reality: as artists blur the boundaries between real and virtual, the way we create and consume art will be transformed.
  • Blockchain powers a personal data revolution: there is growing unease at the way many companies like Amazon, Facebook and Google require or encourage users to give up significant control of their personal information; 2017 will be the year when the blockchain-based hardware, software and business models that offer a viable alternative reach maturity, ensuring that it is not just companies but individuals who can get real value from their personal data.
  • Next generation social movements for health: we’ll see more people uniting to fight for better health and care, enabled by digital technology, and potentially leading to stronger engagement with the system; technology will also help new social movements to easily share skills, advice and ideas, building on models like Crohnology where people with Crohn’s disease can connect around the world to develop evidence bases and take charge of their own health.
  • Vegetarian food gets bloodthirsty: the past few years have seen growing demand for plant-based food to mimic meat; the rising cost of meat production (expected to hit $5.2 billion by 2020) will drive kitchens and laboratories around the world to create a new wave of ‘plant butchers’, who develop vegan-friendly meat substitutes that would fool even the most hardened carnivore.
  • Lifelong learners: adult education will move from the bottom to the top of the policy agenda, driven by the winds of automation eliminating many jobs from manufacturing to services and the professions; adult skills will be the keyword.
  • Classroom conundrums, tackled together: there will be a future-focused rethink of mainstream education, with collaborative problem solving skills leading the charge, in order to develop skills beyond just coding – such as creativity, dexterity and social intelligence, and the ability to solve non-routine problems.
  • The rise of the armchair volunteer: volunteering from home will become just like working from home, and we’ll even start ‘donating’ some of our everyday data to citizen science to improve society as well; an example of this trend was when British Red Cross volunteers created maps of the Ebola crisis in remote locations from home.

In summary

It’s clear that there is an expectation that the use of artificial intelligence and machine learning platforms will proliferate in 2017 across multiple business, social and government spheres. This will be supported with advanced tools and capabilities like virtual reality and augmented reality. Together, there will be more networks of connected devices, hardware, and data sets to enable collaborative efforts in areas ranging from health to education and charity. The Nesta report also suggests that there could be a reality check, with a possible backlash against the open internet and the widespread use of personal data.

Original article here.



GE Supplies IoT Developer Kit For Predix

2016-08-30

GE Digital is making it easier for IoT developers to tap into Predix analytics and use machine learning with the release of a new hardware and software kit.

The internet of things took a step toward becoming easier to manage as GE released its Predix Developer Kit on July 26. It’s a bundle of GE-supplied hardware and software that makes it easier to collect machine data over the internet.

Developers supply an IP address, an Ethernet connection, an electrical socket, and enough programming to indicate what data they want to collect. The Predix Kit appliance automatically establishes the connection, registers its presence with a central version of Predix, and starts transmitting time-series data such as temperature, pressure, speed, or flow readings from the device sensors that it’s attached to.

GE Digital owns and operates the hardware and software combination on behalf of the user, who subscribes to its output. That’s for the most secure form of IoT device data collection, based on a GE Field Agent piece of hardware using a ruggedized PC. Developers may also opt to use an Intel Edison board with a dual-core CPU and WiFi connection, or a Raspberry Pi motherboard running an ARM CPU.

Without a Predix Developer’s Kit, an IoT programmer would set up a board to connect to a device, download software for operation of such a device, and build screens to visualize those operations. Then a programmer would have to do the type of programming that can recognize the device monitored and the hardware on which it was sitting, a task likely to take many hours, Mark Bernardo, professional services leader for GE Digital, Americas, wrote in a blog posted Tuesday.

“Predix Kits can reduce that time to less than 15 minutes,” Bernardo wrote.

Predix is GE Digital’s analytics system that can capture and store machine data and then apply analytics to it to learn from it and predict possible trouble points or future failures and how they can be addressed.

It became generally available Feb. 22 at the Mobile World Congress trade show in Barcelona.

Included in the kit is the Predix UI or user interface that GE Digital released in early March, a week after the Predix analytics system became generally available. It includes a set of software components that can be used by an interface designer or developer to create an IoT application that makes use of Predix services.

In addition, GE Digital announced Tuesday that it has opened the first “digital foundry” for IoT applications in Paris. Three more will be operating by the end of the year. Market researcher Evans Data has reported that IoT is attracting developers at a breakneck pace; there are already 6.2 million developing for it worldwide.

[Want to see how GE Predix works with the cloud? Read Microsoft, GE Partnership Targets Industrial Cloud.]

GE has seen 500 developers sign onto the Predix.io website as programmers each week since it became available in February. It counts 12,000 Predix developers as of today, and expects 20,000 by the end of the year.

GE Digital is using Predix to help it better manage electricity-generating turbines in aging Italian power plants.

Bernardo wrote that the Predix Kits would also be good for building applications to collect data from and manage solar energy projects. They can be used in creating mine safety systems that monitor oxygen levels in operations deep underground, or that monitor other factors in environments that are difficult to work in.

They can also be used to help create smart buildings that can track movements and room temperatures inside to improve comfort and security.

Original article here.



18 Software Documentation Tools that Do The Hard Work For You

2016-08-28

Without documentation, software is just a black box. And black boxes aren’t anywhere near as useful as they could be, because their inner workings are hidden from the very people who need them out in the open.

Software documentation turns your software into a glass box by explaining to users and developers how it operates or is used.

You’ve seen documentation tools before, but if you need a refresher, here are 18 tools that can help any software shop (via Business2Community).

