What's New in KNIME Analytics Platform 3.4, KNIME Server and KNIME Big Data Extensions

This year's summer release, on July 12, 2017, brings a major KNIME® Software update. Here we highlight some of the major changes, new features, and usability improvements in both the open source KNIME Analytics Platform and the commercial KNIME products. Check out this little video too, which demos some of the new features.

You can upgrade from an existing KNIME Analytics Platform 3.x installation by choosing the Update option in the File menu, or download a fresh copy from the download page.

KNIME Analytics Platform 3.4

KNIME Server 4.5

KNIME Big Data Extensions

See the full list of changes in the changelog.


KNIME Analytics Platform 3.4

New Date & Time integration

 

There are now completely revised nodes in KNIME for handling date and time data types. These include more granular data types such as local date and time, durations, and better handling of time zones.
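The distinction between these types mirrors the one in Python's standard library, which makes for a convenient illustration. The sketch below (plain Python, not KNIME code) shows the difference between a zone-less "local" date-time, a zoned date-time, and a duration:

```python
from datetime import date, datetime, timedelta
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# A "local" date and date-time carry no time zone information.
local_date = date(2017, 7, 12)
local_dt = datetime(2017, 7, 12, 9, 30)

# Attaching an explicit zone turns it into a zoned date-time.
zoned_dt = local_dt.replace(tzinfo=ZoneInfo("Europe/Berlin"))

# A duration is a first-class value that can be added to date-times.
duration = timedelta(hours=2, minutes=15)
later = zoned_dt + duration

# Converting between zones keeps the instant but changes the wall-clock time.
in_new_york = later.astimezone(ZoneInfo("America/New_York"))
```

Here `later` reads 11:45 in Berlin, while `in_new_york` shows the same instant as 05:45 local time, which is exactly the kind of bookkeeping the revised nodes take care of for you.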

 

Integration with the H2O machine learning library

 

H2O is an open source machine learning and predictive analytics library with a strong focus on scalability and performance. This first version of the KNIME-H2O integration provides KNIME nodes for a collection of H2O’s functionality for machine learning and scoring.

 

KNIME Personal Productivity now part of KNIME Analytics Platform

The KNIME Personal Productivity Suite is now free and open source, just like the rest of KNIME Analytics Platform. You can use local metanode templates, call local workflows from within another KNIME workflow, set up your own version of KNIME’s Workflow Coach, and use the Workflow Diff to show the differences between workflows or between different versions of the same workflow.

 

 

 

Wrapped metanode composite view

 

The default view for wrapped metanodes containing quickforms and/or JavaScript views now shows the same view you would see if the wrapped metanode were opened with the KNIME Web Portal. This opens up many new possibilities for interactive data analysis in KNIME. For best performance and interactivity we recommend that you configure this view to use your local install of the Chrome browser.

 

A new version of the Python integration

 

There is a new version of the KNIME-Python integration in KNIME Labs. The new implementation supports both Python 2 and Python 3 and will enable us to continue making improvements in the future.
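To give a flavor of what a script in a Python node looks like: the node hands the input table to the script as a pandas DataFrame and expects a DataFrame back. The sketch below writes that logic as a plain function so it also runs outside KNIME; the column names (`price`, `quantity`) are made up for the example.

```python
import pandas as pd

# Inside a KNIME Python node the input table arrives as a pandas
# DataFrame and the result is handed back as another DataFrame.
# The same logic, written as a plain function so it runs standalone:
def transform(input_table):
    output_table = input_table.copy()
    # Works identically under Python 2 and Python 3.
    output_table["total"] = output_table["price"] * output_table["quantity"]
    return output_table

# Example input resembling a small KNIME table.
table = pd.DataFrame({"price": [2.5, 4.0], "quantity": [3, 2]})
result = transform(table)
```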

 

 

Logistic Regression nodes are more scalable, faster, and support regularization

 

We’ve completely rewritten the backend code for the Logistic Regression Learner. Using the ‘SAG’ (stochastic average gradient) solver, the node is now considerably faster and supports larger datasets, both in terms of the number of rows and the number of features. It also supports regularization and provides a new output table with detailed statistics about the calculated coefficients.
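KNIME’s learner is its own implementation, but readers who want to experiment with the same ideas in code can reach for scikit-learn, which exposes a comparable SAG solver and L2 regularization (the names below are scikit-learn’s, not KNIME’s):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic two-class problem with many rows and features -- the regime
# where a stochastic average gradient (SAG) solver pays off.
X, y = make_classification(n_samples=5000, n_features=50, random_state=0)

# C is the inverse regularization strength: a smaller C means stronger
# (L2) regularization, shrinking the coefficients toward zero.
clf = LogisticRegression(solver="sag", C=0.1, max_iter=1000, random_state=0)
clf.fit(X, y)

train_accuracy = clf.score(X, y)
```

Varying `C` and inspecting `clf.coef_` is a quick way to see how regularization trades coefficient size against fit, the same trade-off the node’s new coefficient-statistics output table lets you examine.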

 

 

Audio and speech recognition nodes

 

Now KNIME can listen to you! In this release we’ve added nodes that allow you to work with audio files inside of KNIME and use a number of speech recognition tools. But it’s not just for speech: we have also built in functionality for extracting features such as beats per minute and frequency distributions from those audio files.

 

JavaScript Views

 

We’ve added three new JavaScript views to KNIME:

 

  • Network viewer: for viewing directed and undirected networks. It supports multiple layout and formatting options.

 

  • Sunburst chart: a very useful visualization for sequences of events (like click streams), and it works nicely for other hierarchical data as well.

 

  • Stream Graph/Stacked Area chart: a convenient way of displaying the evolution of multiple data series over a particular interval.

 

New Cloud Connectors

 

There are two new database connectors allowing you to use KNIME’s standard database nodes to work with databases hosted by Amazon Web Services. The Amazon Redshift connector allows you to take advantage of Amazon’s scalable and fully managed data warehouse solution, while the Amazon Athena connector lets you query collections of files stored in S3 as if they had been loaded into a relational warehouse. We’ve also provided nodes that make it easy to create and destroy Redshift clusters: you can spin up a managed warehouse in the cloud, run some experiments, and clean up the warehouse when you’re finished, all from inside KNIME Analytics Platform.

 

 

KNIME Server 4.5

“Deploy to Server” and “Open in WebPortal” menu items

 

It’s now even easier to work with KNIME Server from inside KNIME Analytics Platform. You can deploy workflows from your local workflow repository to a KNIME Server with a single click using the “Deploy to Server” option. Additionally, you can open the WebPortal page for workflows that are stored on a KNIME Server via a context menu in the KNIME Explorer.

 

 

Preview of distributed executors

 

We are working on extending KNIME Server to support executing workflows using multiple distributed executors instead of relying exclusively on the compute power available on the server itself. We think this is important new functionality and would love to get early feedback from KNIME Server customers, so we will be making several previews available. The first of these is ready to go; please get in touch with your contact at KNIME for instructions on how to get access.

 

 

KNIME Big Data Extensions

Cloud connectors for common big data file formats

 

The Spark IO nodes now support reading and writing of common big data file formats from Cloud storage systems such as Amazon S3 and Azure Blob Store.

 

Support for Spark 2.0

 

The KNIME Spark Executor now supports the creation and management of Spark 2.0 jobs.
Along with Spark 2.0 support, we have introduced new Spark Java Snippet nodes that allow expert users to write their own Spark jobs using the new Spark DataFrame API.

Please note that, due to a bug in Apache Spark, the Hive to Spark and Spark to Hive nodes do not work on secured clusters in this release. We are working on a solution, which we expect to release later this year.

 

 

Many other improvements have been made under the hood – please refer to the changelog.