
Another reason why Data Management and Analysts cannot lead separate lives

14 Feb

I found this article interesting in that it points out why the bridge between the data side of the house and the analytical side must be well established – if the data team implements a design that does not support analytics, it has material impacts. I know this is blindingly obvious, but…

I have recently been in a number of discussions where the attitude was: we are going to build the data warehouse using best practices and years of experience, and it really does not matter what you are going to do with the data. I know it sounds crazy, but… you know what I am talking about – we see it all the time.

The article itself tests performance of a columnar versus a row-oriented relational approach to persisting data, and has some surprising results – a 4,100% improvement! I would be interested in other studies that have looked at the difference between data architectures when performing analytical tasks.
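To make the intuition concrete, here is a toy sketch (not a benchmark, and not from the article) of why a columnar layout favors analytical queries: a query that touches one field scans a single contiguous list instead of walking every field of every record.

```python
# Toy illustration: the same data laid out row-wise versus column-wise.
# An analytical query ("average order amount") needs one field, so the
# columnar layout reads only what it needs.

# Row-oriented: each record carries every field; a scan touches them all.
rows = [
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 80.0},
    {"id": 3, "region": "US", "amount": 100.0},
]

# Column-oriented: one list per field; the query scans a single list.
cols = {
    "id": [1, 2, 3],
    "region": ["EU", "US", "US"],
    "amount": [120.0, 80.0, 100.0],
}

avg_row = sum(r["amount"] for r in rows) / len(rows)
avg_col = sum(cols["amount"]) / len(cols["amount"])

assert avg_row == avg_col == 100.0
```

In a real engine the win comes from I/O and cache behavior (plus compression on homogeneous columns), which is where numbers like the article's come from.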

Gartner BI & Analytics Magic Quadrant is out…

10 Feb

The report can be obtained here, along with some industry analysis here.

Well, the top-right quadrant is becoming a crowded place.

Gartner BI Quadrant 2013

I have not had time to really go over this and compare it to last year’s, but the trends and challenges that we have been seeing are reflected in this report; some interesting points:

  1. All of the enterprise-level systems are reported to be hard to implement. This is no surprise – what always surprises me is that companies blame this on one vendor or another – they are all like that! It has to be one of the decision criteria when selecting one of the comprehensive tool sets.
  2. My sense is that IBM is coming along – and is in the running to be the uber BI / analytics company. However, the write-up indicates that growth through acquisition is still happening. This has traditionally led to confusion in the product line and difficulty in implementation. This is especially the case when you implement in a big data or streaming environment.
  3. TIBCO and Tableau continue to go head to head. I see Spotfire on top from a product perspective with its use of “R”, the purchase of Insightful and building on its traditional enterprise service bus business. However, Gartner calls out the cost model as something that holds Spotfire back. This is especially true when compared to Tableau. My sense is that if TIBCO is selling an integrated solution, then they can embed the cost of the BI capabilities in the total purchase, and this is how they are gaining traction. Regardless – Spotfire is a great product and TIBCO is set to do great things, but their price point sets them up against SAS and IBM, while their flagship component sets them up against Tableau at a lower price point. My own experience is that this knocks them out of the early stage activity, and hence they are often not “built in” to the later stage activity.
  4. SAS continues to dominate where analytics and Big Data are involved. However, it is interesting to note that Gartner calls out that they are having a hard time communicating business benefit. This is critical when you are selling an enterprise product at a premium price. Unlike IBM, which can draw on components that span the enterprise, SAS has to build the enterprise value proposition on the analytics stack alone – this is not a problem unique to SAS – building the value proposition for enterprise-level analytics is tough.
  5. Tableau is the darling of the crowd and moves into the Gartner Leaders’ Quadrant for the first time. The company has come out with a number of “Big Data” type features. They have connectors to Hadoop, and the article refers to in-memory and columnar databases. While these things are important, and the lack of them was holding the company back from entering certain markets, it is a bit at odds with their largest customer segment and their traditional positioning approach. Addressing larger data volumes and a more integrated approach takes them more directly into the competitive sphere of the big guys (SAP, IBM and SAS), and also into the sphere of TIBCO Spotfire.
  6. It would be interesting to run the Gartner analysis along different use cases (fraud, risk management, consumer market analysis, etc.). In certain circles one hears much of companies like Palantir, which has a sexy interface and might do well against Spotfire and Tableau, but is not included here. Detica is another company that may do well. SAS would probably come out on top in certain areas, especially with the new Visual Analytics component. There are probably other companies that have comprehensive BI solutions for particular markets – if anyone has information on these types of solutions, I would be interested in a comment.

More to follow – and there is probably much more to say as things continue to evolve at a pace!

Open Source versus COTS

26 Jan

Public Sector Big Data: 5 Ways Big Data Must Evolve in 2013

Much of this article rings true. However, the last section requires some explanation:

“One could argue that as open source goes in 2013, Big Data goes as well. If open source platforms and tools continue to address agency demands for security, scalability, and flexibility, benefits from Big Data within and across agencies will increase exponentially. There are hundreds of thousands of viable open source technologies on the market today. Not all are suitable for agency requirements, but as agencies update and expand their uses of data, these tools offer limitless opportunities to innovate. Additionally, opting for open source instead of proprietary vendor solutions prevents an agency from being locked into a single vendor’s tool that it may at some point outgrow or find ill-suited for their needs.”

I take exception to this in that the decision to go open source versus COTS is really not that simple. It depends on a number of things: the nature of your business; the resources available to you; and the enterprise platforms and legacy systems in place, to name a few. If you implement a COTS tool improperly you can be locked into using that tool – just the same as if you implement an open source tool improperly.

How locked in you are to any tool is largely a question of how the solution is architected! Be smart and take your time: make sure the logical architecture provides the right level of abstraction, which yields modularity and thus flexibility. This article talks about agile BI architectures – we need to be thinking the same way about system architectures.
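A minimal sketch of that abstraction argument, in Python: the application codes against a narrow interface, so the underlying engine – open source or COTS – can be swapped without rippling through the system. The class and method names here are hypothetical, not taken from any product.

```python
from abc import ABC, abstractmethod

class SearchIndex(ABC):
    """All the rest of the system is allowed to know about indexing."""

    @abstractmethod
    def add(self, doc_id: str, text: str) -> None: ...

    @abstractmethod
    def query(self, term: str) -> list:
        """Return the ids of documents containing the term."""

class InMemoryIndex(SearchIndex):
    """Stand-in implementation; a Lucene-backed or vendor-backed
    engine would implement the same interface."""

    def __init__(self):
        self._docs = {}

    def add(self, doc_id: str, text: str) -> None:
        self._docs[doc_id] = text.lower()

    def query(self, term: str) -> list:
        return [d for d, t in self._docs.items() if term.lower() in t]

def ingest(index: SearchIndex) -> None:
    # Application logic depends only on SearchIndex, never the vendor.
    index.add("a1", "Open source versus COTS")
    index.add("a2", "Gartner BI Magic Quadrant")

idx = InMemoryIndex()
ingest(idx)
assert idx.query("cots") == ["a1"]
```

The lock-in question then reduces to how much code lives outside the interface – which is an architectural decision, not a licensing one.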

My feeling is that we are headed to a world where COTS products work in conjunction with open source – currently there are many examples of COTS products that ship with open source components – how many products ship with a Lucene indexer, for example?


Is this the rebirth of Sybase?

2 Jan

Sybase + Hana a potentially powerful combo?

Business Rules Engines & “Prescriptive Analytics”

24 Dec

Someone asked me the other day about how best to evaluate business rules engines (BRE) or business rules management systems (BRMS). The following are my quick notes. A BRE is part of a larger solution. I like what this month’s INFORMS magazine said when talking about “Prescriptive Analytics” (page 14), which I find falls into the same category as “Adaptive Analytics”. They define prescriptive analytics as follows:

“Prescriptive analytics leverages the emergence of big data and computational and scientific advances in the fields of statistics, mathematics, operations research, business rules and machine learning. Prescriptive analytics is essentially this chain of transformations whereby structured and unstructured big data is processed through intermediate representations to create a set of prescriptions (suggested future actions). These actions are essentially changes (over a future time frame) to variables that influence metrics of interest to an enterprise, government or another institution.”

With that in mind to frame a discussion of BRMS…

Why was the BRE created? Different companies have approached things differently, which has resulted in differing feature sets. TIBCO has a rules engine that was built around their Enterprise Service Bus (ESB) business; SAS has a rules engine feature organized around processing rules within the construct of its various analytical packages (anti-money laundering, fraud); StreamBase has a rules engine optimized around applying “real time” rules to streaming financial trading data.

Many BRE / BRMS products started as workflow or content management tools, and conflate the idea of a rules engine with workflow management. This is important to understand, as certain applications of a BRE need to be abstracted away from the workflow management aspect of things.

The above tends to frame why BRE products are built the way they are. However, there are a core set of capabilities that exist within all rules engines:

Rule Management. Does the BRMS provide the rules authoring environment that is appropriate for the particular installation? One is looking for the ability of the rules authoring approach to support a level of abstraction around how business entities are identified and how rules are applied to those entities (Business Entity Model).

How does the BRMS support the ability to evaluate current rules and re-use rules, or portions of rules? This is related to how the BRMS supports the evolution of rules in order to optimize them.

Processing functions: When creating rules, what functions exist within the rules authoring component of the engine that can be called on by the user? This discussion falls into two categories:

  1. Functions that are universal – think of these as functions similar to those available in Excel: sum, count, average, absolute value, etc.
  2. Functions that are external to the rules engine but whose invocation is handled by the tool.

The latter is perhaps most important from an analytical perspective in that few rules engines appear to have capabilities to apply a complex rule based on a native set of functions. Additionally, the ability to call external functions, or integrate non-BRE logic, impacts a BRE’s ability to deal with file based data and the associated processing that is part of the Hadoop / MapReduce approach to the Big Data challenge.
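The two categories above can be sketched in a few lines. This is a toy model, not any vendor’s engine, and every name in it is illustrative: a small table of “universal” built-ins the rule author can call directly, plus externally registered functions (an analytic model score, say) that the engine invokes but does not implement.

```python
# "Universal" functions available to every rule author, Excel-style.
UNIVERSAL = {
    "sum": sum,
    "count": len,
    "avg": lambda xs: sum(xs) / len(xs),
}

class RuleEngine:
    def __init__(self):
        self.externals = {}

    def register_external(self, name, fn):
        # Non-BRE logic (a statistical model, a MapReduce job result,
        # etc.) exposed to rule authors under a simple name.
        self.externals[name] = fn

    def evaluate(self, rule, values):
        # A rule here is just a (function-name, threshold) pair.
        fn_name, threshold = rule
        fn = UNIVERSAL.get(fn_name) or self.externals[fn_name]
        return fn(values) > threshold

engine = RuleEngine()
engine.register_external("risk_score", lambda xs: max(xs) * 0.5)

assert engine.evaluate(("avg", 50), [40, 80]) is True        # avg = 60
assert engine.evaluate(("risk_score", 45), [40, 80]) is False  # score = 40
```

The point of the distinction is in that last pair of lines: the rule author writes both rules the same way, but only the first is computable from the engine’s native functions.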

How rules are triggered. How does the tool support rule management from a processing perspective? In complex event processing there needs to be a way to manage when rules are executed. This becomes important when a collection of rules must be executed only on receipt of all the data required. This particular perspective is framed by how the solution has been architected. In general, there is a desire to reduce the input / output activity with the data source(s) if that activity does not produce an actionable result.
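The “fire only when all the data has arrived” point can be sketched as follows – a toy gate, with all names hypothetical, that buffers incoming fields and evaluates the rule set exactly once per complete event, so no evaluation (and no source I/O) is wasted on partial data.

```python
class TriggeredRuleSet:
    """Fires its rule only once every required input is present."""

    def __init__(self, required, rule):
        self.required = set(required)  # fields the rule depends on
        self.pending = {}              # fields received so far
        self.rule = rule
        self.fired = []                # results of completed evaluations

    def on_event(self, field, value):
        self.pending[field] = value
        # Evaluate only when the full set of inputs has arrived.
        if self.required <= self.pending.keys():
            self.fired.append(self.rule(self.pending))
            self.pending = {}          # reset for the next event

rs = TriggeredRuleSet(
    required=["amount", "country"],
    rule=lambda d: d["amount"] > 1000 and d["country"] != "US",
)

rs.on_event("amount", 5000)   # incomplete: nothing fires
assert rs.fired == []
rs.on_event("country", "CA")  # complete: the rule fires exactly once
assert rs.fired == [True]
```

Real complex event processing engines add time windows and out-of-order handling on top of this, but the architectural question is the same: who decides when a rule set is ready to run.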

Sourcing the input. What functions native to the BRE provide access to data? Are there connectors to all the major databases; unstructured sources; web sites (open source); file systems? This is perhaps less of an issue than it used to be, as the notion of data virtualization has evolved and tools are available to easily format data for delivery to the BRE.

Interfaces. What interface capabilities exist to all of the BRE components? Does the BRE have well-published APIs that expose the functions required to effectively integrate the tool into the overall workflow? The ability to reach into the system to expose information of interest is critical. This is most important from an operational perspective. In the management of complex event processing rules, the ability to look into the system and create a simplified view of what can be an overwhelming level of data is critical.

See also:

  • December INFORMS magazine article on the analysis of imagery and “Prescriptive Analytics”. This frames the discussion on the combination of capabilities that needs to exist in a decision support capability for complex and adaptive problems; a rules-based approach is one of those capabilities.
  • Semantic Reasoning
  • Complex Event Processing


Palantir’s marketing machine never ceases to amaze me!

11 Dec

Well, Palantir has done it again – check out this article. I like this product, and the folks that built it have done a good job. However, it never ceases to amaze me how well they market. Have you been to the web site – www.palantir.com/ ? Plenty of videos and glossies, but little technical information – a polar opposite to the SAS web site – www.sas.com/ . I am not sure of the technical or product underpinnings of the Distributed Common Ground System (DCGS) that Raytheon put together, but it appears that it may be heading the way of other “Big Bang” approaches: http://www.raytheon.com/capabilities/products/dcgs/ . If anyone has any insights into the products underlying the DCGS, let me know. But back to my point – Palantir gets a shout-out from the Army, and they are the only company mentioned – nice. You cannot pay for that kind of marketing.
