Tableau Conference 2014 – A Field Report


I just returned from the annual Tableau Conference 2014 (affectionately referred to as TC14) and I'm on data overload! The big event took place Sept. 8-11 in beautiful Seattle, and I must confess that the experience really energized me. Tableau knows how to throw a good tech event, one of the best I've attended! I was impressed by all the planning that must have gone into the show, as evidenced by the smooth-as-silk way everything came off. Congratulations on behalf of data aficionados everywhere.

The 7th annual customer conference was THE event for “data lovers, data leaders, data geeks, and data enthusiasts everywhere.” The first Tableau conference attracted a very modest 180 people. This year, a reported 5,500+ people were in attendance, although at times it seemed like many more.

Dan Jewett, VP of Product Management, gave an overview of the conference on theCUBE.

One consistent message throughout the conference was that data can expand our creative potential as human beings. Data analysis is a creative process. People working with data are emerging as some of the most creative problem solvers of the modern organization. Releasing their creative genius is the most important goal of modern BI strategy. It is becoming clear to more and more companies that unlocking the creative capabilities of their people is an essential aspect of their competitive advantage.

To support creative problem solving, analytics software needs to do four things:

  • Fast experimentation – quick prototyping with data.
  • Speed – the system needs to give the creator feedback immediately.
  • Expressiveness – no templates, dialog boxes, or wizards that constrain our thoughts; instead, a canvas for innovation.
  • Control – when you have an inspired idea, you need to be able to carry it out yourself.

These areas create a culture of creative analytics in companies, and Tableau says this is the system it is building. The company started in 2003 and now has 20,000 customers worldwide.

The conference served as a vehicle for conveying all the next-generation features planned for upcoming product releases. In many cases, the features are the result of customer comments and requests. They fall into seven areas, listed below. For each area I will describe a number of key features announced at the show for the upcoming 9.0 release (the current release is 8.2, with 8.3 due out before the end of the year). Various company executives conducted demos of work-in-progress features, and I heard frequent applause as the big crowd showed appreciation for Tableau's hard work and wise decisions:

  • Visual analytics – This is one of Tableau’s hallmark differentiators, giving the customer the ultimate canvas for expressing ideas, one that promotes impromptu experimentation. Several big steps forward were announced: easier creation of time-series orientations, entering free-form calculations across multiple data sources, adding calculations to your data model on the fly, a new and more powerful “calculation editor” using simple drag-and-drop, a new “analytics pane” (including forecasting with linear, exponential, and polynomial trend lines), a new “table calculations editor,” geographic search, and radial and lasso selection tools. (A rough trend-line sketch appears after this list.)
  • Performance – Performance is one of Tableau’s highest engineering priorities because it is hard, and a large number of improvements are coming. The goal is to get results to users in seconds or milliseconds – going beyond the Viz Engine introduced in version 8. Tableau’s data engine will ingest data more quickly, and a massive update to the engine takes performance to the next level by exploiting the vectorized operations of modern multi-core processors (a toy illustration of vectorization follows this list). Queries will run up to 4x faster than in 8.2. One demo, using an NYC taxi database of all rides taken in 2013 – 173 million trips – ran on a MacBook Pro with a four-core processor, and multiple passes over the data set were extremely fast. Live connections are also faster to a variety of databases, from departmental SQL databases to enterprise-class data warehouses, using parallel query processing to support complex dashboards. Tableau is also working on new technology to scale speed across the entire organization, using “persistent query caches” to share prior results among users on all nodes of the cluster.
  • Data preparation – Another goal is to shorten the time it takes to get data into Tableau and ready for analysis. “Automatic data modeling” was introduced in 8.2, but they’re working hard to add even more. For example, the “split” capability will automatically parse a field into separate parts right in the data window – it can sample the data, identify the delimiters, and create separate columns. Tableau is making the product smart enough to automatically detect the structure of an Excel file and identify the field names and data without any input from the user. The data view will also be able to “unpivot” and “reshape” the data (a rough pandas analogy for split and unpivot appears after this list). They’re also working on optimized data connectors for analytic databases and cubes, big data in Hadoop, cloud-based data, and file-based data, and there is an API for partners like Adobe, Alteryx, and Informatica to create optimized data extracts from their own data. But one type of data hasn’t been addressed yet – web services data. With the new “web data connector,” you’ll be able to connect Tableau to internal web services through REST APIs, JSON data, etc.
  • Storytelling – “Story points” was a new feature in 8.2 designed to bring storytelling to the world of data – letting you express ideas that have structure, narrative art, and context. The next chapter in story points enhances the story point navigator to allow for greater storytelling such as the “martini graph” where the stem is the linear part of the story, and the wide part is where you make the story your own. The goal is to provide a creative canvas.
  • Enterprise – Companies are now deploying Tableau to tens of thousands of people. The company is working on ways to make Tableau easier to administer, integrate, and scale. Version 8.3, coming out later this year, will add support for the Kerberos network authentication protocol. In the past year, Tableau has also added new APIs, including a JavaScript API, a data extract API, and a REST API (a sign-in sketch against the REST API appears after this list). For the business user on Tableau Server, accessing and managing workbooks has been made extremely fast, including the display of search, filter, and sort results. Managing permissions has been made much easier and more effective. High availability is a priority for the next year in terms of performance and scalability.
  • Cloud – The cloud is near and dear to Tableau, as the business runs 100% in the cloud – NetSuite, Salesforce, etc. For cloud analytics to be a real option for a complete BI solution you need four things: (i) the ability to connect to cloud data sources, (ii) to connect to business applications in the cloud such as Salesforce, (iii) to connect to on-premises data sources such as Excel spreadsheets and SQL Server, and (iv) to get the analytics where you do your work. Tableau Online launched just over a year ago. In one compelling use case, Sling used the Amazon Redshift cloud data warehouse with Tableau Online without having to provision any hardware at all. OAuth was recently added to Tableau Online to support centralized credentials, and a capability called “Tableau data sync” will help you publish on-premises data to Tableau Online. Also recently added is the ability to embed interactive dashboards in Salesforce, right where people do their work every day. For example, salespeople can get a view of an opportunity in Salesforce, together with usage data from Redshift and support tickets from an on-premises database – all in the same view inside Salesforce, fresh and updated automatically. And of course, the data is interactive.
  • Mobile – Tableau Mobile is an ambitious initiative to make mobile analytics a reality. Much rethinking and redevelopment has gone into making Tableau Mobile fast and effective, so that users can truly experiment with data. Tableau is bringing custom calculations to mobile and the web for the first time, and all of the drag-and-drop analytics are being designed to work across platforms – desktop, mobile, and web. The company is also building a new mobile app, Project Elastic, as a new way to explore data on your tablet. It is fast and fluid even with a lot of data on an everyday tablet – it is as if the app disappears and you are alone with your data, where you can even drill down into individual data records.
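
To make the trend-line options in the new analytics pane a little more concrete, here is a minimal Python sketch of fitting linear and polynomial trends with NumPy's least-squares fit. The data and the degree choices are made up for illustration; this is not Tableau's implementation.

```python
# Illustrative only: fit linear and polynomial trend lines to a small series,
# the same kinds of trends the new analytics pane exposes.
import numpy as np

x = np.arange(12, dtype=float)                                # e.g. twelve months
y = 3.0 * x + 5.0 + np.random.normal(0.0, 2.0, size=x.size)   # noisy "sales" figures

linear = np.polyfit(x, y, deg=1)   # slope and intercept of the linear trend
cubic = np.polyfit(x, y, deg=3)    # coefficients of a degree-3 polynomial trend

print("linear trend:  y = {:.2f}*x + {:.2f}".format(*linear))
print("cubic trend coefficients:", np.round(cubic, 2))
```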
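
As a toy illustration of why vectorized operations matter for the performance claims above, the sketch below contrasts a scalar Python loop with a whole-array NumPy computation. The data and the 8% surcharge are invented, and this says nothing about how Tableau's data engine is actually built.

```python
# Illustrative only: scalar loop vs. vectorized computation over 10 million values.
import time
import numpy as np

fares = np.random.uniform(2.5, 60.0, size=10_000_000)  # stand-in for taxi fares

t0 = time.perf_counter()
total_loop = 0.0
for f in fares:              # one element at a time
    total_loop += f * 1.08   # hypothetical 8% surcharge
t1 = time.perf_counter()

total_vec = float(np.sum(fares * 1.08))  # whole-array (vectorized) computation
t2 = time.perf_counter()

print(f"loop total {total_loop:,.2f} in {t1 - t0:.2f}s")
print(f"vectorized total {total_vec:,.2f} in {t2 - t1:.2f}s")
```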
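
The "split" and "unpivot" steps in the data-preparation bullet map onto familiar operations in other tools. Here is a rough pandas analogy, with made-up column names, just to show what those two transformations do; it is not how Tableau implements them.

```python
# Illustrative only: "split" a delimited field into columns, then "unpivot"
# (reshape) year columns into rows.
import pandas as pd

df = pd.DataFrame({
    "region_segment": ["West-Consumer", "East-Corporate"],
    "2012": [120, 340],
    "2013": [150, 310],
})

# Split: parse one delimited field into separate columns.
df[["region", "segment"]] = df["region_segment"].str.split("-", expand=True)

# Unpivot: turn the year columns into tidy (year, sales) rows.
tidy = df.melt(id_vars=["region", "segment"],
               value_vars=["2012", "2013"],
               var_name="year", value_name="sales")
print(tidy)
```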
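
And to give a flavor of the REST API mentioned in the enterprise bullet, here is a minimal Python sign-in sketch using the requests library. The server URL, credentials, and the "2.0" API version are placeholders of mine – check the REST API reference for your release before relying on any of this.

```python
# Illustrative sketch: sign in to Tableau Server's REST API and grab the
# session token that authenticates later calls. All values are placeholders.
import xml.etree.ElementTree as ET
import requests

SERVER = "https://tableau.example.com"  # hypothetical server URL
USERNAME = "analyst"
PASSWORD = "secret"

signin_xml = (
    "<tsRequest>"
    f"<credentials name='{USERNAME}' password='{PASSWORD}'>"
    "<site contentUrl='' />"   # empty contentUrl means the default site
    "</credentials>"
    "</tsRequest>"
)

resp = requests.post(
    f"{SERVER}/api/2.0/auth/signin",          # API version may differ by release
    data=signin_xml,
    headers={"Content-Type": "application/xml"},
)
resp.raise_for_status()

# The response body contains a <credentials token="..."> element.
root = ET.fromstring(resp.content)
token = next(el for el in root.iter() if el.tag.endswith("credentials")).attrib["token"]
print("Signed in; token starts with:", token[:8])
```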

If I had one complaint about the Tableau Conference, it was that there were just too many people! Plus, the show was spread across a number of venues. I think Tableau should rethink the location of next year’s conference, as it has simply outgrown Seattle. In recognition of this reality, Tableau Conference 2015 is scheduled for Las Vegas – sadly my least favorite venue for a tech conference (lingering memories of covering the old COMDEX shows).

To wrap up this field report, I thought I’d take the liberty of describing my most memorable moment of the show – meeting noted astrophysicist Neil deGrasse Tyson. He’s been a personal idol of mine for some time, especially since his Cosmos documentary series aired on prime-time TV (FOX, no less). Having done research in his field myself a number of years back, I feel his ability to engage the lay public in a highly scientific discipline is an important service toward educating people in the ways of science – especially the scientific method. His keynote talk on Day 2 achieved rock-star status, with unbelievably long lines, a polished delivery that included chortle-inducing stand-up comedy, and a culminating standing ovation. Way to go, Tableau!

Daniel – Managing Editor, insideAI News

 


Comments

  1. Tableau is good for beginners who want to build analytical reports in a short time, but it is not as capable when users need to build reports from customized plots or drawings. In technical terms, it does not provide an option for users to build their own charts or plots using a built-in scripting language.

    If Tableau doesn’t choose to provide a way for users to script their visualizations, it will soon turn out to be an excellent “art piece” as opposed to a functional tool.