Archive | Best Practices

Business Framework for Analytics Implementation

14 Sep

Updated 9/14/20 with new links. It is a bit ironic that I linked to the Dataversity site, and they do not use persistent identifiers to label their data assets, so all my links are dead. Note to practitioners – if you are not using persistent identifiers, your institutional knowledge captured in data assets lasts only as long as the identifier!
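The point is worth making concrete. A minimal sketch of a persistent-identifier registry (all names and URLs here are hypothetical, invented for illustration): the stable ID never changes, only the location it resolves to does, so references to the ID survive reorganizations of the underlying storage.

```python
# Sketch of a persistent-identifier (PID) registry. The identifier is
# minted once and never changes; when the asset moves, only the mapping
# is updated, so every reference to the PID keeps working.

import uuid


class PidRegistry:
    def __init__(self):
        self._locations = {}

    def mint(self, location: str) -> str:
        """Issue a new persistent identifier for a data asset."""
        pid = str(uuid.uuid4())
        self._locations[pid] = location
        return pid

    def move(self, pid: str, new_location: str) -> None:
        """Update the location; the identifier itself is unchanged."""
        if pid not in self._locations:
            raise KeyError(f"unknown identifier: {pid}")
        self._locations[pid] = new_location

    def resolve(self, pid: str) -> str:
        """Return the current location for a persistent identifier."""
        return self._locations[pid]


registry = PidRegistry()
pid = registry.mint("https://example.com/reports/2016/q1.xlsx")
registry.move(pid, "https://example.com/archive/q1-2016.xlsx")
print(registry.resolve(pid))  # a saved URL would now be dead; the PID still resolves
```

Had Dataversity published identifiers like this rather than raw URLs, the links in the original post would still work.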

I went looking for this deck as I was having a discussion on governance that is as old as the hills: how do you link data governance activities to the business activities they support – in short, why does data governance exist?

The other discussion that got me looking at this article again was how to build an operating model for organizations where the governance team does more than respond to quality requests – how does the team proactively address data issues?

Both of these are tied to the article below. The Hoshin Framework (at least as it is presented below) ties strategic initiatives all the way down to identified data capabilities that can be addressed proactively to support the business strategy. 

A note on the spreadsheet: it is not for the faint of heart. The spreadsheet supports the thought exercise used to shape discussions and your communication with stakeholders. The key takeaway is that it gives you the ability to relate the governance budget to strategic goals, funded programs, current projects, and metrics. Think of it as the audit worksheets – no one ever sees those, and the auditor reports out only the results.
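The relationships the spreadsheet captures can be sketched in a few lines of code. This is a hypothetical simplification (the goal, program, project, and metric names are invented): governance spend is recorded per project, projects roll up to funded programs, and programs roll up to strategic goals, so budget can be reported by goal.

```python
# Sketch of the budget-to-strategy linkage: project-level governance
# spend rolls up through funded programs to strategic goals.

from dataclasses import dataclass


@dataclass
class Project:
    name: str
    program: str              # funded program this project belongs to
    governance_budget: int    # governance spend allocated to the project
    metric: str               # how success is measured


# goal -> funded programs that support it
goals = {"Improve customer retention": ["CRM Modernization"]}

projects = [
    Project("Customer MDM", "CRM Modernization", 120_000, "duplicate-record rate"),
    Project("Consent capture", "CRM Modernization", 80_000, "% records with consent"),
]


def budget_by_goal(goal: str) -> int:
    """Roll governance spend up from projects through programs to a goal."""
    programs = set(goals[goal])
    return sum(p.governance_budget for p in projects if p.program in programs)


print(budget_by_goal("Improve customer retention"))  # 200000
```

The point, as with the spreadsheet, is not the tooling but the traceability: every governance dollar can be tied to a funded program and a strategic goal.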

Original Post.

In my previous post I discussed some analytical phrases that are gaining traction. Related to that, I have had a number of requests for the deck that I presented at the Enterprise Dataversity Data Strategy & Analytics Forum. I have attached the presentation here. NOTE: This presentation was done a few years ago while I was with CMMI (now ISACA); as a result, it is tied to their Data Management Maturity Model. I talked about analytics, and my colleague on the presentation addressed data maturity.

Also, while I am posting useful things that people keep asking for, here are a set of links that Jeff Gentry did on management frameworks for a Dataversity Webinar. Of particular interest to me was the mapping of the Hoshin Strategic Planning Framework to the CMMI Data Management Maturity Framework. The last link is the actual excel spreadsheet template.

Links:

  1. Webinar Recording: http://www.dataversity.net/cdo-webinar-cdo-interview-with-jeff-gentry-favorite-frameworks/. Here is a link to the deck.
  2. Link to Using Hoshin Frameworks. Hoshin is bigger than just this matrix, and it is a heavy process for most people. However, the following gives you some background: http://www.slideshare.net/Lightconsulting/hoshin-planning-presentation-7336617
  3. Hoshin Framework linked to DMM: Data Analytics Strategy and Roadmap Template 20160204D.xlsx


The topic of protecting personal information will grow in importance in 2019

19 Nov
IAPP Annual Report 2018
For those interested in the protection of personal information, the IAPP has an interesting – albeit rather hefty – IAPP-EY Annual Privacy Governance Report 2018, and the NTIA has released its comments from industry on pending privacy regulation. I noted that the IAPP report indicates most solutions are still largely or entirely manual. I am not sure how this does not become a management nightmare as organizations evolve their data maturity to align operations and marketing more closely. Data management as a process discipline, plus some degree of automation, will be a critical capability for ensuring personal information is protected. There are simply too many opportunities for error when this is done manually.
I recently published an article in TDAN on automating data management and governance through machine learning. It is not just about ML; other capabilities will be required. However, as long as organizations rely on manual processes alone, they open themselves to risk and place the burden on management to enforce policies that are often resisted because they are perceived as an obstacle to actually doing business. Data management as a process discipline, in conjunction with automated processes, will reduce operational overhead and risk.

Architecting the Framework for Compliance & Risk Management

24 Oct

Really quick visit to the Data Architecture Summit this year. I wish I could have stayed longer, but I had to get back to a project.

My presentation was on creating audit defensibility that ensures practices are compliant and performed in a way that is scalable, transparent, and defensible – thus creating “Audit Resilience.” Data practitioners often struggle with viewing the world from the auditor’s perspective. This presentation focused on how to create the foundational governance framework supporting the data control model required to produce clean audit findings. These capabilities are essential in a world where due diligence and compliance with best practices are critical to addressing the impacts of security and privacy breaches.

Here is the deck. This was billed as an intermediate presentation, and we had a mixed group of business folks and IT people with good questions and dialogue. I am looking forward to the next event.

Agile – we just keep trying to make it work!

3 Aug

In the summer of 2013, I must have been thinking about Agile approaches to development, as I wrote two blogs on the topic.

I was interested to see that Martin Fowler released an article on yet another approach to fixing what is wrong with agile: the Agile Fluency Model. The article provides a good, comprehensive write-up of this approach. However, go back and look at the links in the above blogs. There are a number of amusing ones: one from Martin Fowler titled Flaccid Scrum, and two very amusing ones here and here. They all refer to the same set of challenges facing how agile is implemented.

I am not sure I have anything to add to the debate. However, I do note that successful teams invariably: 1) involve a whiteboard; 2) engage in lively and dynamic dialogue around the challenge; and 3) have team members with an intuitive, user-centric understanding of the problems the team seeks to solve.

I guess I am also surprised that we are still talking about how to “do” agile!

Link to the Agile Fluency Model Diagnostic

Update: Interesting article here by Joshua Seckel 

DGIQ 2018

12 Jul

The DGIQ conference went well this year. I gave two presentations and caught up with industry colleagues and customers. It helped that it was in San Diego – the weather, relative to the hot mugginess of the Mid-Atlantic, was excellent.

My presentation on GDPR was surprisingly well attended. I say surprising in that the deadline has passed, yet I find there are still companies formulating their plans. However, I am beginning to feel a bit like Samuel L. Jackson.


In the GDPR presentation, the goal was to focus attention not only on doing the right thing to be compliant, but also on doing it right. How do we reduce the stress and overhead of dealing with regulators? We call this “Audit Resilience.” I spoke to a number of people who are taking a wait-and-see approach to GDPR compliance. Interestingly, even though they are taking this approach, they are still getting requests to remove personal information. It seems to me that if you are taking a wait-and-see approach, you still need to be able to remove personal information from at least the web site; otherwise, you risk triggering a complaint, and then … you have no defense. The goal has to be to do everything possible not to trigger a complaint. The presentation took about 15 minutes, and the rest of the time was spent demonstrating the data control model in the DATUM governance platform – Information Value Management.

I also had the pleasure of presenting with Lynn Scott, who co-chairs the Healthcare Technology & Innovation practice at Polsinelli with Bill Tanenbaum. We wanted to drive home the point that collaboration is key when dealing with thorny risk and compliance issues. We tried to have some fun with this one.

I will be at the Data Architecture Summit in Chicago in October. The session will cover:

  • What are the requirements to ensure management is “audit resilient”?
  • What is a Control System and how is it related to a Data Control Model?
  • What is “regulatory alignment” from a data perspective?
  • How do I build a Data Control Model?
  • What role do advanced techniques (AI, Machine Learning) play in audit resilience?

Hope to see you all there!


Will the US evolve towards a GDPR “like” approach to personal information?

3 Jul

CA GDPR Law

In a conversation with a lawyer a few months ago, the comment was made that the US has already implemented GDPR – it has just done small bits of it in each state; collectively similar to GDPR, though no one jurisdiction is anything like GDPR. Except now we have California implementing the California Consumer Privacy Act, which will go into effect in January 2020. This regulation is similar in spirit, and in many details, to GDPR. What is fascinating is how the bill was enacted. This article explains how California politics works, and points out that the rapid adoption of the legislation is actually an attempt to create a more flexible environment for companies to negotiate the various compromises that I am sure will come. It is also worth noting that companies already well on the way towards GDPR compliance will essentially already be compliant with the California law. I do not expect California to be the last state to create or update its privacy laws – that trend was already underway. However, California is a big state and home to many tech companies, and its new law will surely influence how other states address the privacy issue.

Update 1: Comments on non EU countries updating laws – Canada

https://www.jdsupra.com/legalnews/canada-to-update-data-law-to-gdpr-16052/

Update 2: IAPP comment on the California law:

Audit Resilience and the GDPR

15 May

Compliance activities for organizations are often driven from the legal or risk groups. The initial focus is on management’s position and actions required to be compliant; generally this starts with the creation of policies. This makes sense as policies are a reflection of management’s intent and provide guidance on how to put strategic thinking into action. The legal teams provide legal interpretation and direction with respect to risk. This is also incorporated into the policies. So, what happens next as your organization addresses challenges around ensuring effective implementation and subsequent operational oversight of policies required for General Data Protection Regulation (GDPR) compliance?

THE CHALLENGES

The challenges associated with GDPR, as with other compliance activities, center on achieving “Audit Resilience.” We define this as the ability to address the needs of the auditor – internal or external – in such a way that compliance is operationally enabled and can be validated easily, with minimal disruption and cost. The goal is to reduce the stress, chaos, and costs that often accompany these events to a manageable level.

WHAT DOES AUDIT RESILIENCE MEAN?

Audit Resilience means that the auditor can:

  • Easily discern the clear line of sight between Policies => Standards => Controls => Actors => Data.
  • Review and explicitly align governance artifacts (policies, standards and processes) to compliance requirements.
  • Access and validate the “controls” that ensure standards are applied effectively.
  • Find evidence of execution of the governance practices within the data.
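That first bullet – the line of sight from policy down to data – can be sketched as a simple linkage walk. This is a hypothetical illustration (the artifact names are invented): if every governance artifact records a link to the level below it, the auditor's trace from any policy to the data it governs is mechanical.

```python
# Sketch of the auditor's "line of sight": each artifact links to the
# level below it, so a policy can be traced
# Policies => Standards => Controls => Actors => Data.

chain = {
    "policy:privacy": ["standard:retention"],
    "standard:retention": ["control:purge-job"],
    "control:purge-job": ["actor:dba-team"],
    "actor:dba-team": ["data:customer-table"],
}


def trace(artifact: str, links: dict) -> list:
    """Walk the linkage from an artifact down to the data it governs."""
    path = [artifact]
    while artifact in links:
        artifact = links[artifact][0]  # sketch: follow the first link only
        path.append(artifact)
    return path


print(trace("policy:privacy", chain))
# ['policy:privacy', 'standard:retention', 'control:purge-job',
#  'actor:dba-team', 'data:customer-table']
```

A real governance platform maintains many-to-many links and evidence at each hop; the point here is only that the chain must be explicit and machine-traversable, not reconstructed by hand at audit time.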


CRITICAL SUCCESS FACTORS

GDPR compliance is a function of creating logical linkage and consistency across multiple functions and actors – down to the data level. Details will vary based on the organization and the assessment of risk.

Overall, the following are critical to successfully demonstrating compliance:

  1. Produce a catalog of all impacted data
  2. Know where data is being used, and by whom
  3. Show governance lineage from Policy => Process => Standard => Control => Data
  4. Report on effectiveness of “Controls”
  5. Produce specific data related to particular requirements such as: Security Events, Notification, Privacy Impact Assessments, and so forth.
  6. Show the relationship of governance tasks to both data and the business processes that use Personal Information.
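The first two factors – a catalog of impacted data, and knowing where it is used and by whom – can share one structure. A hypothetical sketch (the elements, systems, and actors are invented for illustration): each catalog row records a data element together with the system, process, and actor that use it, so both questions are answered by the same query.

```python
# Sketch of a data catalog that answers "where is this element used,
# and by whom?" from a single structure.

catalog = [
    {"element": "email", "system": "CRM", "process": "marketing", "actor": "campaign-team"},
    {"element": "email", "system": "billing", "process": "invoicing", "actor": "finance"},
    {"element": "ssn", "system": "HR", "process": "payroll", "actor": "hr-ops"},
]


def usage(element: str) -> list:
    """Return every (system, actor) pair that uses a data element."""
    return [(row["system"], row["actor"]) for row in catalog if row["element"] == element]


print(usage("email"))  # [('CRM', 'campaign-team'), ('billing', 'finance')]
```

In practice the catalog also carries the governance lineage of factor 3, so a subject-access or erasure request can be routed to every system and actor touching the data.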

Another Data Mart?

12 Jul

Martin’s Insights published the article below. It begs the question – what to do? Clearly a CDP is created to solve an unmet need. Whatever the answer is for any given organization, data must be known “in context” and must be traceable back to its original form to survive scrutiny. Here is the article.

======================================

Recently you may have heard – from your business network or circle of marketing friends – that Customer Data Platforms (CDPs) is the new ‘black’. Can a CDP really be an all-rounded solution to marketing’s most pressing problem, when it comes to enhancing customer experience? Certainly, if you are in the BI field, the concept…

via Trend Alert – Customer Data Platforms — Martin’s Insights
