This is a guest blog post by Carole Ann Matignon of Sparkling Logic. You can see more on Sparkling Logic at http://my.sparklinglogic.com/
Decision Management is about combining predictive models and business rules to automate decisions for your business. Insurance underwriting, loan origination or workout, and claims processing are all very good use cases for that discipline… But there is a hiccup… It ain’t as easy as you would expect…
If you have a neat model, then most tools would allow you to export it as a PMML model – PMML stands for Predictive Model Markup Language and is a standard XML representation for predictive model formulas. Many model development tools let you export it without much effort. Many BRMS – Business Rules Management Systems – let you import it. Tada… The model is ready for deployment.
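In R, for instance, this export step can be sketched with the third-party pmml package (an assumption here: the package must first be installed from CRAN, and support for your particular model type should be verified):

```r
# Fit a simple logistic regression on a built-in dataset
model <- glm(am ~ mpg + wt, data = mtcars, family = binomial)

# Export to PMML using the third-party 'pmml' package, if available
if (requireNamespace("pmml", quietly = TRUE)) {
  pmml_doc <- pmml::pmml(model)
  # XML::saveXML(pmml_doc, "model.pmml")  # write the XML document to disk
}
```

A BRMS that supports PMML import could then consume the resulting XML file directly.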
The problem that we keep seeing over and over in the industry is the issue around variables.
Those neat predictive models are formulas based on variables that may or may not exist as-is in your object model. When the variable is itself a formula over the object model – like the min, max or sum of the dollar amount spent on groceries in the past 3 months – and the object model comes with transaction details, so that you can compute the variable by iterating through those transactions, then the problem is not “that” big. PMML 4 introduced some support for those derived variables.
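For example, computing such a derived variable from transaction details is straightforward in R (hypothetical data and cutoff date, base R only):

```r
# Hypothetical transaction-level data
tx <- data.frame(
  customer = c(1, 1, 1, 2, 2),
  category = c("Groceries", "Groceries", "Fuel", "Groceries", "Fuel"),
  amount   = c(120, 80, 40, 60, 35),
  date     = as.Date(c("2011-01-05", "2011-02-10", "2011-02-12",
                       "2011-01-20", "2011-03-01"))
)

# Derived model variable: total grocery spend per customer since a
# cutoff date (standing in for "the past 3 months")
recent <- tx[tx$category == "Groceries" & tx$date >= as.Date("2011-01-01"), ]
grocery_spend <- aggregate(amount ~ customer, data = recent, FUN = sum)
```

The hard part, as the next paragraph explains, is when the operational data no longer carries the transaction detail needed to recompute such variables.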
The issue that is not easy to fix, and yet quite frequent, is when the model development data model does not resemble the operational one. Your Data Warehouse very likely flattened the object model, and pre-computed some aggregations that make the mapping very hard to restore.
It is clearly not an impossible project as many organizations do that today. It comes with a significant overhead though that forces modelers to involve IT resources to extract the right data for the model to be operationalized. It is a heavy process that is well justified for heavy-duty models that were developed over a period of time, with a significant ROI.
This is a show-stopper though for other initiatives which do not have the same ROI, or would require too frequent model refresh to be viable. Here, I refer to “real” model refresh that involves a model reengineering, not just a re-weighting of the same variables.
For those initiatives where time is of the essence, the challenge will be to bring closer those two worlds, the modelers and the business rules experts, in order to streamline the development AND deployment of analytics beyond the model formula. The great opportunity I see is the potential for a better and coordinated tuning of the cut-off rules in the context of the model refinement. In other words: the opportunity to refine the strategy as a whole. Very ambitious? I don’t think so.
About Carole Ann Matignon
Carole-Ann Matignon – Co-Founder, President & Chief Executive Officer
She is a renowned guru in the Decision Management space. She created the vision for Decision Management that is now widely adopted in the industry. Her claim to fame is managing the strategy and direction of Blaze Advisor, the leading BRMS product, while she also managed all the Decision Management tools at FICO (business rules, predictive analytics and optimization). She has a vision for Decision Management both as a technology and as a discipline that can revolutionize the way corporations do business, and will never get tired of painting that vision for her audience. She speaks often at industry conferences and has conducted university classes in France and Washington DC.
She started her career building advanced systems using all kinds of technologies — expert systems, rules, optimization, dashboarding and cubes, web search, and a beta version of database replication. At Cleversys (acquired by Kurt Salmon & Associates), she also conducted strategic consulting gigs around change management.
While playing with advanced software components, she found a passion for technology and joined ILOG (acquired by IBM). She developed a growing interest in Optimization as well as Business Rules. At ILOG, she coined the term BRMS while brainstorming with her Sales counterpart. She led the Presales organization for Telecom in the Americas up until 2000 when she joined Blaze Software (acquired by Brokat Technologies, HNC Software and finally FICO).
Her 360-degree experience allowed her to gain appreciation for all aspects of a software company, giving her a unique perspective on the business. Her technical background kept her very much in touch with technology as she advanced.
Please use the following code to get a 15% discount on the 2 Day Conference Pass: AJAY11.
Predictive Analytics World announces new full-day workshops coming to San Francisco March 13-19, amounting to seven consecutive days of content.
These workshops deliver top-notch analytical and business expertise across the hottest topics.
Register now for one or more workshops, offered just before and after the full two-day Predictive Analytics World conference program (March 14-15). Early Bird registration ends on January 31st – take advantage of reduced pricing before then.
- Driving Enterprise Decisions with Business Analytics – March 13, 2011 (James Taylor, CEO, Decision Management Solutions)
- NEW – R for Predictive Modeling: A Hands-On Introduction – March 13, 2011 (Max Kuhn, Director, Nonclinical Statistics, Pfizer)
- The Best and Worst of Predictive Analytics: Predictive Modeling Methods and Common Data Mining Mistakes – March 16, 2011 (John Elder, Ph.D., CEO and Founder, Elder Research, Inc.)
- Hands-On Predictive Analytics – March 17, 2011 (Dean Abbott, President, Abbott Analytics)
- NEW – Net Lift Models: Optimizing the Impact of Your Marketing – March 18-19, 2011 (Kim Larsen, VP of Analytical Insights, Market Share Partners)
Save now with the early bird rate. Receive $200 off your registration rate for Predictive Analytics World – San Francisco (March 14-15), plus $100 off each workshop for which you register.
Additional savings of $200 on the two-day conference pass when you register a colleague at the same time.
- JMP Launches Global Online Store Powered by e-academy, Inc. (prweb.com)
- In case you missed it: December Roundup (revolutionanalytics.com)
- IBM Study Predicts Collaboration, Analytics And Cloud Computing As Drivers Of Growth In Midmarket (cloudave.com)
- StatSoft Releases STATISTICA Version 10 Analytics Solutions (prweb.com)
- In-Depth Hands-on Workshops Delivered By Analytics Experts and Leading Practitioners at Predictive Analytics World March 13-17, 2011, San Francisco, California (prweb.com)
Analyzing data can have many challenges associated with it. In the case of business analytics data, these challenges or constraints can have a marked effect on the quality and timeliness of the analysis as well as the expected versus actual payoff from the analytical results.
Challenges of Analytical Data Processing-
1) Data Formats- Reading in complete data, without losing any part (or metadata), or adding superfluous details (that increase the scope). Technical constraints of data formats are relatively easy to navigate thanks to ODBC and well-documented, easily searchable syntax and language.
The costs of additional data augmentation (should we pay for additional credit bureau data to be appended?), the time spent storing and processing the data (every column needed for analysis can add as many values as there are rows in the dataset, which becomes a real time burden if you are considering an extra 100 variables over a few million rows), but above all business relevance and quality guidelines will ensure that basic data input and massaging are a considerable part of the whole analytical project timeline.
2) Data Quality- Perfect data exists in a perfect world. The price of perfect information is one that a business will mostly never budget or wait for. Delivering inferences and results based on summaries of data with missing, invalid and outlier values embedded in it makes the role of the analyst just as important as whichever tool is chosen to remove outliers, replace missing values, or treat invalid data.
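A minimal base-R sketch of such treatment – median imputation for missing values and winsorizing for outliers – on a hypothetical numeric vector:

```r
# Hypothetical vector with missing values and an outlier
x <- c(12, 15, NA, 14, 13, 200, NA, 16)

# Replace missing values with the median of the observed values
x_imputed <- ifelse(is.na(x), median(x, na.rm = TRUE), x)

# Cap outliers at the 5th and 95th percentiles (winsorizing)
caps <- quantile(x_imputed, probs = c(0.05, 0.95))
x_treated <- pmin(pmax(x_imputed, caps[1]), caps[2])
```

Whether to impute, cap, or drop such values is exactly the analyst judgement call this point is about.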
3) Project Scope-
How much data? How much analytical detail versus high-level summary? Timelines for delivery as well as refresh of data analysis? Checks (statistical as well as business)?
How easy is it to load and implement the new analysis in existing Information Technology Infrastructure? These are some of the outer parameters that can limit both your analytical project scope, your analytical tool choice, and your processing methodology.
4) Output Results vis-à-vis stakeholder expectation management-
Stakeholders like to see results, not constraints, hypotheses, assumptions, p-values, or chi-square values. Output results need to be streamlined into a decision management process to justify the investment of human time and effort in an analytical project; the choice of analytical tool, training, and navigating its complexities and constraints are a subset of that. Optimum use of graphical display helps align results into a form more palatable to stakeholders, provided the graphics are done well.
E.g., Marketing wants more sales, so it needs a clear campaign to target certain customers via specific channels with specified collateral. To ground that business judgement, business analytics needs to validate, cross-validate and sometimes invalidate this business decision making with clear, transparent methods and processes.
Given a dataset- the basic analytical steps that an analyst will do with R are as follows. This is meant as a note for analysts at a beginner level with R.
Package-specific syntax
update.packages() #This updates all packages
install.packages(package1) #This installs a package locally, a one time event
library(package1) #This loads a specified package in the current R session, which needs to be done every R session
CRAN → local hard disk → R session is the top-to-bottom hierarchy of package storage and invocation.
ls() #This lists all objects or datasets currently active in the R session
> names(assetsCorr) #This gives the names of variables within a dataframe
"AssetClass" "LargeStocksUS" "SmallStocksUS"
"CorporateBondsUS" "TreasuryBondsUS" "RealEstateUS"
"StocksCanada" "StocksUK" "StocksGermany"
"StocksSwitzerland" "StocksEmergingMarkets"
> str(assetsCorr) #gives complete structure of dataset
'data.frame': 12 obs. of 11 variables:
$ AssetClass : Factor w/ 12 levels “CorporateBondsUS”,..: 4 5 2 6 1 12 3 7 11 9 …
$ LargeStocksUS : num 15.3 16.4 1 0 0 …
$ SmallStocksUS : num 13.49 16.64 0.66 1 0 …
$ CorporateBondsUS : num 9.26 6.74 0.38 0.46 1 0 0 0 0 0 …
$ TreasuryBondsUS : num 8.44 6.26 0.33 0.27 0.95 1 0 0 0 0 …
$ RealEstateUS : num 10.6 17.32 0.08 0.59 0.35 …
$ StocksCanada : num 10.25 19.78 0.56 0.53 -0.12 …
$ StocksUK : num 10.66 13.63 0.81 0.41 0.24 …
$ StocksGermany : num 12.1 20.32 0.76 0.39 0.15 …
$ StocksSwitzerland : num 15.01 20.8 0.64 0.43 0.55 …
$ StocksEmergingMarkets: num 16.5 36.92 0.3 0.6 0.12 …
> dim(assetsCorr) #gives the dimensions (number of observations and variables)
 12 11
str(Dataset) – This gives the structure of the dataset (note structure gives both the names of variables within dataset as well as dimensions of the dataset)
head(dataset,n1) gives the first n1 rows of dataset while
tail(dataset,n2) gives the last n2 rows of a dataset where n1,n2 are numbers and dataset is the name of the object (here a data frame that is being considered)
summary(dataset) gives you a brief summary of all variables while
describe(dataset) – from the Hmisc (or psych) package – gives a more detailed description of the variables
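Putting those commands together on mtcars, a dataset that ships with base R:

```r
data(mtcars)    # load a built-in dataset of car specifications

str(mtcars)     # structure: names, types and dimensions in one shot
head(mtcars, 3) # first 3 rows
tail(mtcars, 3) # last 3 rows
summary(mtcars) # min, quartiles, mean and max for every variable
dim(mtcars)     # 32 observations, 11 variables
```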
Simple graphics can be produced with commands such as hist() and plot().
As you can see in the above cases, there are multiple ways to get even basic analysis about data in R – however most of the syntax commands are intuitively understood (like hist for histogram, t.test for t test, plot for plot).
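A short sketch of those commands on the built-in mtcars data (plots are sent to a temporary PNG file so the snippet also runs non-interactively):

```r
# Send graphical output to a temporary PNG file
png(tempfile(fileext = ".png"))

hist(mtcars$mpg, main = "Miles per gallon")  # histogram of one variable
plot(mtcars$wt, mtcars$mpg)                  # scatter plot of two variables

dev.off()

t.test(mpg ~ am, data = mtcars)              # two-sample t test by group
```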
For detailed analysis throughout the scope of a project, a business analytics user is recommended to use multiple GUIs and multiple packages. Even for highly specific and specialized analytical tasks, it is worth checking for a GUI that incorporates the required package.
- The data analysis path is built on curiosity, followed by action (radar.oreilly.com)
- Using Datasets in KRL (Flickr RSS) (code.kynetx.com)
- R interface to Google Chart Tools (r-bloggers.com)
- How To Get Experience Working With Large Datasets (highscalability.com)
- A portal for European government data: PublicData.eu plans (onlinejournalismblog.com)
- 5 Datasets You Can Buy and Use for SEO (and a few for free!) (seomoz.org)
- Integrated Longitudinal Database Available in Census Centers (kauffman.org)
Ohri’s Theorem on Decision Management, regarding which software to buy-
1) Assuming no budget constraints
Let X be the degree of appropriateness of the software for a particular use,
where 0 is totally bad and 1 is perfect for the use.
Then the probability p that the software is selected = P/Q, where P is the total number of users who know how to use the software (like R) and Q is the total number of users who don’t know how to use the software (like Macros or R).
As the number of users begins to increase,
P/Q converges to the integral of X dx.
James is hosting a webinar series on decision management, predictive analytics and business rules this fall. You can check out the webinars and register for some or all at https://decisionmanagement.omnovia.com/registration
Here is an interview with James Taylor, a leading consultant and evangelist in the emerging field of converging decision management.
Ajay- Describe your career in science. What fascinates you with reporting on this segment. How would you interest freshmen students in taking up statistics and math courses.
James- I took Geological Geophysics and Mathematics in college but graduated in a year when the oil price was in free fall and never worked in geophysics. Since then I have worked in computers, mostly focused on how they can be applied rather than on how they work. I am not sure I would say that this represents a career in science so much as a career enabled by science and, increasingly, watching science.
As far as math goes I actually think the problem is at the other end of the spectrum. Far too many people leave school without a feel for math – it is taught in a very narrow way and leaves far too many feeling that math is something that other people do. In a world with more and more data, and more and more statistics/data-driven decisions this is not ok. We need everyone in business to be able to consume math intelligently, even if they can’t develop mathematical models themselves. Continuing with traditional math teaching in high school and college is just excluding most people and that has to end.
Ajay- What are the various stages of evolution that you have seen in the Decision Management Industry, including the prevailing jargon name.
James- Decision Management applications have been around for years, albeit primarily in the financial services industry. They used to all have their own categories – fraud systems, origination systems, account management systems – and this was the beginning of the category. One of the first things I did at FICO was describe all these applications as a set – recognizing that the same approach and the same cluster of technologies was being used in each case. Back in 2002 I and some colleagues started calling this approach Enterprise Decision Management. Back then most decision management was enabled by these packaged applications and the tools that could be used to build custom applications were talked about separately – business rules, optimization, predictive analytics.
Over the last 6 years the focus on decisions in each of these areas has increased – more rules people talk about managing decisions with rules and there’s more talk of improving operational decisions in predictive analytics and optimization circles. There’s more talk of using the tools/technologies together and a growing range of integrated suites/platforms.
Where most companies stand today is wanting the kind of capabilities that can only be delivered by applying decision management techniques and technologies but they are not yet asking for decision management. They want, for instance, consistent personalized offers across channels but they are not asking for centralized decision management. Based on previous experience I think this will change steadily over the next year or two with the number of companies asking explicitly for decision management capabilities rising.
From a name perspective we have evolved too. Over time it has become clear that the “Enterprise” was misinterpreted as a call for Enterprise-wide implementation of decision management when it was meant as a call for enterprise ownership of decisions. As a result some folks talk about Business Decision Management and I just like to talk about Decision Management.
Ajay- Why is Decision Management more important than say performance management, business intelligence, predictive analytics.
James- I am not sure it is more important. Most organizations need business intelligence to understand what happened in their business and they need performance management to monitor what is happening now. This kind of understanding is important in successful decision management implementations. And decision management is a management discipline designed, in part, to put predictive analytics to work in operational systems.
I do think a focus on decision is vital to all of them, however. If you don’t understand the decisions you are making it is hard for me to see how you can judge the effectiveness of either business intelligence or performance management. And predictive analytics should be even more decision-centric if it is to be effective. So a focus on decisions is a necessary prerequisite and the management of those decisions, using rules and analytics, is a great way to maximize their value in operational systems.
Ajay- What are your views on offshoring 1) High quality research 2) Labor Arbitrage technical work 3) Cost cutting driven
James- Well I think offshoring is an inevitable consequence of an interconnected world. I also think that companies that offshore simply to reduce cost deserve the employee and customer loyalty they will get as a consequence!
I do think that companies should make thoughtful decisions about what to do where, when something must be handled centrally and when it can be pushed to different localities etc. I think that smarter systems – systems that manage decisions explicitly – can help in this and help companies have a real DNA when it comes to decision making.
Ajay- What are the top 5 principles of Decision Management, as you would explain them to a class of business graduates and CEOs
1. Little decisions add up
The day to day decisions that drive operational behavior, customer interactions, transactional systems are more important than the big, strategic decisions beloved of management consultants. Each one seems unimportant but they happen so often that their total value swamps anything else you do. If you get these decisions wrong it won’t matter what you get right.
2. The purpose of information is to decide
Deming has a famous quote that “The ultimate purpose of collecting the data is to provide a basis for action or a recommendation”. The reason you collect data, report on data, analyze data is to make better decisions. Otherwise it’s just a cost. And unless you know which decision you are making, and what will make it a good or a bad one, then all the data in the world (and all the data management or data analysis) will not help you.
3. You cannot afford to lock up your logic
Decision making logic- the policies, regulations, best practices and customer preferences that drive decision making – cannot be locked up in code you cannot read, systems you do not understand. No matter what else might be handled by your IT people, business decision making logic must not be. You must at least be able to collaborate with your IT folks and manage it with them. You must be responsible for this logic.
4. No answer, no matter how good, is static
Organizations must realize that they have to constantly analyze, reassess and challenge their decision making process. The effectiveness of a decision can often not be determined for some time and even a good decision can be degraded by a change in the behavior of a competitor or a change in the market. As such constant challenging of the decision making approach, constant A/B testing or adaptive control is essential if decisions are to remain effective.
5. Decision making is a process to be managed
The way you make decisions is something you must understand, document, automate and analyze. Good managers, good staff, have a good decision making process. Good outcomes might result from luck or circumstance but you don’t want to rely on that. Instead you want to focus on quality decision making processes. And like many repeatable processes, automating decision making makes it easier to analyze and improve it over time.
James has agreed to schedule a free webinar to explain it more fully. Anyone who wants can register at https://decisionmanagement.omnovia.com/registration/pid=74151252469530
Ajay- What does James Taylor do when not in front of a computer, a podium or an airport. How important do you think is work life balance particularly for young people
James- Well I am a parent, a partner and an avid reader and between them those use up most of my non-work time. I really enjoy my work which makes it hard to stop sometimes. I think life/work balance is important but so is enthusiasm for what one does. Perhaps I am kidding myself but I think there is a difference between putting a lot of hours into something about which you are passionate and putting a lot of work into something just to get ahead or to avoid the rest of your life.
Ajay- Do you think BI world is male dominated. What could be the reasons.
James- Yes. The usual sexism of business combined with the average age of BI people (younger groups seem more mixed in general).
Ajay- Green economy and stimulus macro economics. How can both these fields benefit from Decision Management
James- From a macro stimulus point of view I think the key thing is that governments around the world throw money at companies specializing in decision management. <smile>
The green economy, however, is more interesting. Personally I don’t see how smart grids can be made to work without a solid core of powerful decisioning. Green marketing requires personalization and targeting to avoid waste (more decisioning) while helping consumers make better decisions about products based on green criteria needs to be built into shopping engines like Amazon’s if it is to make a real difference. Being green is all about making greener decisions and making systems make greener decisions takes decision management and decisioning technology.
James Taylor is a leading expert in Decision Management and an independent consultant specializing in helping companies automate and improve critical decisions. Previously James was a Vice President at Fair Isaac Corporation where he developed and refined the concept of enterprise decision management or EDM. Widely credited with the invention of the term and the best known proponent of the approach, James helped create the Decision Management market and is its most passionate advocate.
James has 20 years experience in all aspects of the design, development, marketing and use of advanced technology including CASE tools, project planning and methodology tools as well as platform development in PeopleSoft’s R&D team and consulting with Ernst and Young. He has consistently worked to develop approaches, tools and platforms that others can use to build more effective information systems.
James is an active consultant, speaker and author. He is a prolific blogger, with regular posts at jtonedm.com and ebizq.net/blogs/decision_management. He also has an Expert Channel – Decision Management – on the BI Network.
His articles appear in industry magazines, he has contributed chapters to “The Business Rules Revolution: Doing Business The Right Way” (Happy About, 2006) and “Business Intelligence Implementation: Issues and Perspectives” (ICFAI University Press, 2006), and he is the co-author of “Smart (Enough) Systems: How to Deliver Competitive Advantage by Automating Hidden Decisions” (Prentice Hall, 2007) with Neil Raden.
James is a highly sought speaker, appearing frequently at industry conferences, events and seminars. He is also a lecturer at the University of California, Berkeley.
James has an M.S. in Business Systems Analysis and Design from City University, London; a B.S. in Geological Geophysics and Mathematics from the University of Reading, England; and a “Mini-MBA” certificate from the Silicon Valley Executive Business Program at San Jose State University.
You can contact James at email@example.com