Corporate Data Return on Investment

posted by John Myers   | February 22, 2012


Informatica Goal: Maximize Return on Data

The theme of last week’s Informatica Analyst Conference was utilizing the “secular megatrends” of information technology to energize data integration across organizations at an enterprise scale.  These megatrends, described as trends we can all agree upon, are the following:

  • Cloud Computing
  • Social Media
  • Mobile
  • Big Data

Informatica’s message is that organizations can use these megatrends to “Maximize Return on Data.”
As you might imagine, Maximizing Return on Data is the concept of applying return-on-investment thinking to the data Informatica touches with its products and services.  Raising the value of the data in an organization, by integrating Hadoop data stores, utilizing “agile business intelligence,” and removing data quality issues, brings additional value to corporate data and, all things being equal, will increase the return on corporate data expenditures.  Similarly, reducing the time to integrate data sources, improving data delivery with an integrated platform approach and “codeless” interfaces, and reducing operational headcount by productizing previously manual processes associated with master data management (MDM), data quality (DQ), etc. all amplify the return on those corporate data resources.  Do both and NOW you’re talking about the Informatica Return on Data concept.
NOTE – Not all of the analysts were convinced that you can accurately calculate the exact value or benefit of corporate data resources, or their true associated costs.  However, with reasonable accounting allowances, increasing the benefit and reducing the operational costs should yield the qualitative results that Informatica is touting, even without precise quantitative calculations.
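The arithmetic behind the Return on Data idea can be sketched in a few lines.  This is purely illustrative: Informatica did not present a formula, and every figure below is hypothetical.

```python
# Illustrative sketch of the "Return on Data" arithmetic described above.
# All figures are hypothetical; Informatica does not publish a formula.

def return_on_data(data_benefit, operational_cost):
    """Simple ROI-style ratio: value derived from data vs. cost to manage it."""
    return (data_benefit - operational_cost) / operational_cost

# Baseline: $1.0M of value from corporate data at $400K of operational cost.
baseline = return_on_data(data_benefit=1_000_000, operational_cost=400_000)

# Raise the benefit (e.g., Hadoop integration, agile BI, better data quality)
# AND lower the cost (e.g., codeless tooling, productized MDM/DQ processes):
improved = return_on_data(data_benefit=1_200_000, operational_cost=300_000)

print(f"baseline: {baseline:.2f}, improved: {improved:.2f}")
# -> baseline: 1.50, improved: 3.00
```

The point of the sketch is the “do both” message from the conference: increasing the numerator or shrinking the denominator alone helps, but doing both at once is what doubles the ratio here.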

Analyst Conference in Review

Over the course of the two-day conference, Informatica presented the analyst community with a number of topics.  Here are the ones that I feel bear watching:

Informatica on Hadoop

I got the chance to speak with Alex Gorelik, SVP for Research and Development, regarding the Informatica strategy for Hadoop.  In my opinion, much of the 2012 Informatica strategy for Hadoop starts with its HParser technology.  HParser allows for codeless integration of data between traditional data stores and Hadoop ecosystems.  This strategy leverages Informatica’s existing core competencies without making a large technology investment in the still-developing world of Hadoop, and it follows one of the main tenets of Informatica’s strategy for technology development/acquisition.
While I might have wanted additional information on how to “proactively” manage MDM for NoSQL data stores like Hadoop, I clearly understand why Informatica is focused on supporting Hadoop activities as they integrate with traditional components of the “data integration” stack, rather than inserting itself into the inherently “unstructured” (I would say semi-structured) world of Hadoop datasets.

Agile Business Intelligence

The ability to react in “business time” to BI challenges is the focus of “Agile Business Intelligence.”  Personally, I prefer the term “nimble,” since capital-A Agile implies alignment with the Agile development methodology.
That being said, Rob Meyer, Senior Director of Data Integration, provided some great information about Informatica’s ability to execute in nimble BI situations.  Key among these examples is its work with HealthNow (registration required), implementing data access via data virtualization in timeframes that standard EDW methodologies (read: “waterfall development”) could not meet.  The ability to overcome these hurdles is key to making BI more manageable and thus contributing more value to Return on Data.

Informatica MDM

Newly “minted” SVP for MDM Dennis Moore and Ravi Shankar, MDM Product Marketing, walked us through the future of the Informatica MDM strategy.  The strategy for 2011 was based on applying MDM practices to specific domain MDM solutions.  The strategy for 2012 will leverage the secular megatrends (listed above) to speed the implementation of MDM solutions across those specific domains.  Reducing implementation time and its associated costs contributes to the overall Return on Data messaging.
Unfortunately, much of the future MDM strategy is under NDA.  However, in the near future, Informatica will be making significant announcements that will make this strategy more visible.

Sales Operations and Investor Relations

Paul Hoffman, President of Worldwide Field Operations, and Stephanie Wakefield, VP of Investor Relations, both gave excellent overviews of how the Informatica watchwords of “Focus and Accountability” have contributed to the company’s performance over the past year and should impact revenues and margins in upcoming years.
According to Hoffman, sales planning and discipline will help Informatica achieve both its near-term and long-term sales goals without the type of inorganic “gaming” that many organizations use to justify sales projections.  (Note: that is not to say that Informatica will or will not participate in M&A activities in 2012 and beyond; please see Informatica’s standard safe harbor statement for more information.)  Wakefield described how this type of discipline has helped Informatica avoid the pitfalls of other publicly traded software firms over the past 5-7 years.  However, she pointed out that the approach is not foolproof, as any publicly traded firm can attest given the recent volatility in public equity markets.

Ultra Messaging

Mike Pickett, VP of Product Marketing, Jitesh Ghai, Senior Director of Product Marketing, and Mark Mahowald, GM of the Ultra Messaging Business Unit, provided excellent updates on the status of the integration of the 29West acquisition from 2010.
I was particularly impressed with the ability of Informatica to express use cases for Ultra Messaging beyond the standard financial services model.  In particular, I liked the telecommunications and oil and gas use cases that focused more on the ability to utilize next generation data sources rather than speeding up existing data integration/communication tasks.

Informatica Cloud

During the first day, Juan Carlos Soto, GM of Informatica Cloud, provided an update on the Informatica Cloud Data Integration business unit.  With a strategy that began in 2006, Informatica Cloud has matured over the past six years to the point where the following performance is common:

  • 160,000 client-directed data integration jobs per day
  • 21 billion event/row transactions processed per month
  • 99.9% uptime
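For a sense of scale, the cited figures can be converted into per-second rates.  The arithmetic below is my own back-of-the-envelope calculation from the numbers above, not rates Informatica reported.

```python
# Back-of-the-envelope rates from the Informatica Cloud figures cited above
# (160,000 jobs/day, 21 billion rows/month). Per-second numbers are derived
# here, not reported by Informatica; a 30-day month is assumed.

jobs_per_day = 160_000
rows_per_month = 21_000_000_000

jobs_per_second = jobs_per_day / 86_400            # ~1.9 jobs/s, around the clock
rows_per_second = rows_per_month / (30 * 86_400)   # ~8,100 rows/s sustained

print(f"{jobs_per_second:.1f} jobs/s, {rows_per_second:,.0f} rows/s")
```

Roughly two new integration jobs launching every second, sustained all month, is a useful mental picture of what “common” means at this scale.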

And 2012 promises to see more integration between on-premise data integration and cloud-based offerings such as Informatica Cloud.  The choice should not be considered an either/or, but rather a question of which offering best fits the job at hand.

Data Virtualization

Following the two days of sessions, Ash Parikh, Director of Product Marketing, probably will not want to talk with me for a while: I took up his time on both days as he described the integrated approach Informatica takes to the data virtualization space.  Much like the messaging associated with Informatica Cloud, Informatica’s course with data virtualization is a blended strategy, combining standard data integration approaches with data-services-based integration.
Parikh walked me through how Informatica views data virtualization, not as a standalone segment of its data integration business, but as integrated with both standard data integration and its big data integration with Hadoop.  This approach uses the integrated data governance layer (connectivity, reusable transformations, MDM, DQ, data profiling, etc.) from the standard Informatica Platform and insulates end users from the sources of their data projects, whether standard (i.e., ETL), virtualized, or Hadoop.
Overall, I liked the approach and the messaging for data virtualization in that it is part of a larger whole rather than an island unto itself.  Look for additional information on the data virtualization space from me in the near future with an EMA Landscape Report on Data Virtualization.



