ITIL Process Maturity Framework PDF

I'm looking at a 'classic' process maturity assessment done by a consulting firm for a client, and what a useless document it is. I'm not saying who sent it to me, or why, or where it came from. That isn't important here, because so many assessments are similar. Compare yours.

The report analyses eight practices.

It doesn't say why those eight. ITIL has 27 or so, COBIT about 40.

They are a typical eight: Incident, Request, Problem, Change, SACM, SLM, Knowledge, Catalogue.

It tells the client they suck. Maturity is not much above 1 in any of the practices. Is this a problem? What are the risks? Does it matter at the client site?

The report doesn't say.

It offers five high-level recommendations and about eight recommendations for each of the eight processes. That's nearly 70 recommendations, all of them hard. It offers no way of prioritising them and no roadmap for addressing them. That's in the next paid engagement. So what have you told me? That we suck. We knew that - that's why you are here.

How much we suck, and what a huge task we have ahead of us to not suck. Well, that's really going to help launch a programme.

There is zero discussion of the organisational context or what their goals are.

Why do they need to improve? What do they want to achieve? There is very little discussion of any conditions specific to the site.

The only positive I can find is that the assessment used the ITIL PMF (Process Maturity Framework), which has dimensions of vision, culture, people and technology as well as process.

You've just spent 10% of your improvement budget for a slap in the face and you are no further forward. People pay for this?

I'm convinced that in most cases ITIL maturity assessments are a useless waste of money. Most consultants crank the client through a generic sausage machine that takes no account of the client's own goals and priorities. Many assessments stop short of offering any more than a kick in the teeth. Value is extra.

Capability maturity is no use when deciding what to do until we understand what maturity we need and why. Risk and value are far more useful first metrics than maturity for designing or measuring improvement programmes. And many assessments don't even measure capability maturity (TIPA, PMF and ISO20000 do): they measure management maturity using CMM, which is yet another step abstracted from useful reality. We usually use ITIL as a reference framework (a best-practice benchmark) for improvements, so we shouldn't also be using it as a measurement instrument. Think about how cultish it is to measure the improvements made by using a body of knowledge (BoK) by measuring with the same BoK.

You sucked at ITIL but by using ITIL you now suck less at ITIL. Are we delivering more value? Are customers more satisfied? Did we cut costs?

Have we reduced organisational risk? These are more meaningful metrics.

The reason all of these 'tools' and 'approaches' fail is that they only paint part of the picture, and a disconnected one at that. Also, there is a range of methodologies, which has resulted in multiple silos in terms of standards, methodologies and their application.

To eliminate all of this we have taken an approach that captures a full Enterprise Reference Architecture (ERA), its layers, etc. This approach also enables the ERA to be viewed from multiple perspectives and has the benefit of linking all the components in an enterprise together in a meaningful way. Change one thing and it cascades and impacts on other areas. Without a holistic view you work on 'parts' of an enterprise in splendid isolation and generally in a meaningless way.

Very interesting point, and in general we agree conceptually with the ERA.

If you have that then you can look at how to optimise your processes. We also prefer this as a strategy over Continuous Service Improvement, because knowing the process interactions means we can look for ways to apply effective technology to optimise those processes, or 'simply' change those processes.

That being said, an Enterprise Reference Architecture by its very nature (it is in the word 'Enterprise') will be impossible to achieve. Our companies are spread over multiple geographies, with lines of business that intersect and share. The act of defining the ERA in the first place will consume a significant proportion of our resources (I'm thinking of the outsourcing discussion above) to accurately model, and even then it will have changed before the model is completed.

Sorry, nice idea. No cigar from the ITILosaurus.

ITILosaurus, I'm not quite sure any more what you are driving at. Most of your comments indicate sensible thinking and a lot of bad experience working within frameworks termed ITIL. However, I don't really see any suggestions as to what to do, rather than what things are useless. I am not sure what to make of them, i.e. where am I being led?

Like Skep I find myself agreeing with a lot of statements, but feeling like the criticism that comes back is more about how ITIL terms and concepts are understood and implemented (often not very well), rather than about the concepts themselves. Is it a sign that much of ITIL has been picked up and implemented in the real world in a certain way so many times that it now lives its own life and is synonymous with less-than-perfect practice? I am careful not to use the term 'bad practice', as I consider these attempts to leave many organisations in a better state than not trying at all - even if they don't end up with 'best practice'.

Hi Skep, you hit the nail on the head, which is why we created Service Improvement Manager (a cloud-based DIY continual improvement tool) to combat just this very sort of thing. The problem:

- Many consultants use their own spreadsheets that are totally inaccurate (little understanding of good practices in ISO/IEC 15504, CMMI-SVC etc.)
- One client recently complained of a previous consultant's report: they were given a Level 4.0 maturity, but the processes weren't even documented!
- Recommendations are really vague and not targeted to specific business goals, priorities or maturity targets (no Why? WIIFM?)
- They recommend doing all ITIL processes, but for no specific reason.

There is no understanding of the value that a targeted 20K SMS scope can bring either. Way too many consultant-based assessments worth mega $$$ don't deliver much, if any, value at all.

So instead, with SIM we gave our clients a low-cost DIY assessment and continual improvement tool that:

1. Lets you self-assess your processes (Compliance, Capability and Maturity)
2. Shows scores for Current State plus Short-term and Long-term Targets (based on the improvements generated)
3. Automatically generates improvement tasks based on business, process and gap priorities (to get you to the next maturity level)
4. Builds an improvement initiative with a fully costed business case - know what to improve, how much, benefits realisation, ROI/NPV/Payback
5. Tracks and manages the improvements.

See them get checked off against the last assessment. Hope that helps. Michael

Editor's note: I normally remove vendor-promotional comments, but this one is apropos enough to stay. Please don't think this sets a precedent :).

From 1930 to 1956 this ship was a training ship for sailors in Finland.

Even after it stopped sailing, it served as a school for sailors. There must have been people in charge of the sailors' training, like Rob, who thought new technology does not change frameworks. Reef the mainframe and haul up your incidents! Now it is probably just as well that I cannot remember any good word for describing those people who thought that teaching people to sail a frigate was a necessary part of a sailor's training in the 20th century ;) Aale.

Firstly - we're ITIL Septic (not Skeptic) because we believe ITIL is septic. I would love to agree with you, but I find myself agreeing with my esteemed ITILosaurus collective member who commented yesterday. To pick on MTTR is to pick on one context. Read the line 'Continuous Service Improvement'. Now think about that statement. We are in this brave new world unable to define a single 'Service'. We cannot define or maintain the necessary relationships within a CMDB because our infrastructure is tolerant to single faults and thus continuously adapting.

(When a link fails, the VLANs using that link are switched to an alternate link. When a VM fails, the applications are switched to an alternate VM).
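The failover behaviour described above is exactly why a point-in-time relationship model goes stale. A minimal sketch in Python (all names and relationships are hypothetical, purely to illustrate the point):

```python
# Hypothetical sketch: live service-to-link relationships that self-heal on
# failure, versus a CMDB-style snapshot taken at one moment in time.

class Platform:
    def __init__(self):
        # live relationships: service -> network link carrying it
        self.routes = {"payroll": "link-A", "email": "link-A"}
        self.standby = {"link-A": "link-B"}

    def snapshot(self):
        """What a CMDB discovery run would record right now."""
        return dict(self.routes)

    def link_failed(self, link):
        """Automatic failover: remap every service using the dead link."""
        alternate = self.standby[link]
        for service, current in self.routes.items():
            if current == link:
                self.routes[service] = alternate

platform = Platform()
cmdb = platform.snapshot()       # the model we'd base CSI metrics on
platform.link_failed("link-A")   # the infrastructure adapts by itself

stale = {s for s in cmdb if cmdb[s] != platform.routes[s]}
print(sorted(stale))  # → ['email', 'payroll']
```

Every automatic failover silently invalidates part of the recorded model, which is the commenter's point: the users notice nothing, but the CMDB is already wrong.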

Mostly our users are unaffected and do not even notice the bait and switch. How can we maintain a relationship model of all those circumstances? If we cannot model these Services, how are we to firstly measure them and then continuously improve them?

Should we then look at our platform as a whole? Too big. Should we break it down into elements? Well, that is what Problem Management does, isn't it?

It looks at the entities that comprise our platform and reports on their reliability, enabling us to have meaningful discussions with our suppliers. Where does that leave Continuous Service Improvement, pray tell?

MTTR is a red herring. Continuous Service Improvement is a waste of time (and money) and is similar in concept (as my fellow ITILosaurus member mentioned) to Total Quality Management (for those of us old enough to remember that consultants' wet dream of the late 1980s and early 1990s). Time is not a great healer, it appears; time is a great recycler. What happened to all the TQM consultants? They became ISO9000 consultants.

What happened to them? They became Y2K consultants. What happened to them? ITIL. Where next? Don't you love this world?

I have seen and experienced a significant number of organizations that don't understand processes. Think about this: in an outsourcing environment, the outsource provider is the lowest bidder.

(Can you say MINIMALIST?) The contracts are based on LOE slots, which means it's all about bodies. The contracts RARELY have any incentives toward measuring and optimizing processes, measuring effectiveness, or even DOCUMENTING current processes. Many outsourcing companies just won't do it.

So, all in all, you get what you pay for. After all, it is all about cheap and bare minimums and NOTHING about making things better.

And, as long as management follows and manages to this, it will only get worse. I also note that there are a lot of Director-level management types who are Sales-oriented but lack the Engineering discipline or operational exposure sufficient to make good decisions. Some of the signs you see right away are products that have no users, and products that are partially implemented but deliver questionable value.

PoliticalWare. These Design-by-Glossy Directors can be utterly destructive to operations. They throw tools at Operations, then proclaim victory, only to never realize value or even proper integration. In the end, CFOs usually catch wind of the spend-spend-spend Director and they get IXNEYED. Others leave after a few years, leaving a trail of tears behind them.

Not everybody has evolved into providing IT Services. If this is your team, ITIL is probably not going to help you with anything other than providing a common language.

And even then, if it doesn't fit your minimalist, body-shop approach, you will inherently pick and choose which functions you think you support. ITIL provides several key elements: a common language, a foundation of functions, and a starting point for you to fill in the process blanks. ITIL is a methodology. A philosophy. Not a technology.

Another part that makes ITIL implementation difficult is that the tools and utilities are designed for a single user. For example, you are assigned a ticket. If someone else needs to work on it, you transfer the ticket or you create secondary or child tickets. Two people rarely work on the same ticket, and if they do, only one person is able to update that ticket. The tools are not 'collaborative' or team-enabling. There are tools on the horizon that will supercharge your teams through the enablement of information, tools, and collaboration.

You have opened the kimono on Outsource contracts. YOU ARE SO RIGHT!!! This is another elephant in the room that no one discusses.

It's not appropriate for the ITILosaurus Collective, but perhaps we should discuss it there because it makes us all mad! CFOs and Accountants (read: Boring) are bonused on reducing the apparent bottom line. Our 'friends' at outsource company TLA (let's face it, they're either two- or three-letter acronyms) are bonused on profit. Our accountants want to get FTEs off their books as quickly as possible. Our friends want to get us to do the deal as quickly as possible. There is never the time, or the right resource engagement, to accurately document what 'we' actually do in all of its splendour. So we miss things, and our 'friends' bank (straight to it) on us missing things.

The consequence of outsourcing has two detrimental effects on our business.

Immediately we end up in Change discussions (read: additional cash commitments), AND everything slows down. To get something done now requires multiple signatures, interactions, umming and ahhing, risk appraisals, etc. etc. In the old days we walked up to one of our staff and asked 'do you have the time to do X?' They would say 'when do you need it by?'

We would agree a timeframe and kerpow, it would be done. Now when I walk up to the same person, they are a bottleneck, shackled by contractual barriers and timesheets.

To bring this back to ITIL: ITIL puts us on the path to Outsourcing.

Outsourcing is a blight on our collective businesses. (Not commodity outsourcing; operations outsourcing.) ttfn.

I agree with most of what has been said around this topic. Do IT departments want to know how they are performing against a framework (note I did not say standard)? Of course they do. Will the assessment show if or how they are delivering value to the business? Typically not. COBIT assessments are the same - opinion against framework.

Assessment against standards like ISO20k, 27001 etc. is pretty straightforward - you either have the process and evidence or you do not - although I guess there is a case to be made for 'we are in the process of establishing'. However, the one thing I believe has been missed - but was alluded to - is the role the consultant plays in any assessment. Organisations are asking you as the consultant to use your experience, skills and knowledge to help them identify where they are now against where they want to be (perhaps include this as part of the assessment) and what they need to do to achieve it.

The maturity level is as identified by the assessment, but if you as the consultant believe it should be higher and you can explain why (include it in the report), then you can award a higher level of maturity. We should not blame the tool because the craftsman does not have the experience and skill to use the tool.

Hi Dave, I think the consultant has a greater responsibility to understand what the customer actually wants and needs in their assessment, but commercials often constrain this. I know Pink (in common with its competitors) is familiar with the concept of, in essence, fixed-price model-based assessments, and it's this kind of assessment that can lead to the worthless report Skep brought to our attention, precisely because it's essentially off-the-shelf with little heed for what the customer actually needs. But it's low-value work with a high up-sell opportunity, so it's attractive to the consultancies; and this will in turn limit what their consultant will be able to say in their report - why report that everything's rosy if there's a commercial opportunity to say it isn't? It's actually low value to the customer as well, and to their business, unless they specifically want to know how they're doing against someone's interpretation of the ITIL books.

Let me give you one example. In one such model, addressing the question of Infosec ownership, CMM level 5 was 'There is a dedicated Information Security Management team who control and manage all aspects of access requirements.'
Well, if the organisation couldn't support a team, the assessment would never show they are doing as much as they are able to do, which may well be more than the business requires anyway. Typically such assessments aren't performed by the most capable consultants, which means essentially that the assessment is only as good as the model, which in turn is only as good as the boundaries of what it is measuring; and if that's limited to ITIL then it's only ever going to answer one question - how do you think I'm doing against ITIL? And even that's subjective.

So we're back to that question again, and ultimately, however much a consultant might want to expand the question to one closer to customer value, they will inevitably be suspected of gilding the lily. That said, the best thing a consultant can do, probably at the pre-sales point, is explore exactly what the customer wants from their assessment, and then tailor accordingly.

I've come late to this debate, and as always it then becomes hard to find the right point to dive into the conversation. I have on my desk two ITIL assessments from two very different organisations, separated by around six years, but both carried out by the same highly regarded ITIL consultancy. Pretty much the only thing that is different between them is the name of the client and the list of people they've thanked for providing information. Why do so many ITIL/ITSM initiatives fail? Because they are based on such shaky foundations.

An idea I've used in two past roles when carrying out assessments is based on how I used to work as an Internal Auditor, rather than coming at it from an IT-centric perspective. To save you reading the 1400+ pages of Sawyer's Modern Internal Auditing, the approach is very simple.

Needless to say, the basic elements will be familiar to anyone who knows COBIT from the audit end of the spectrum.

1) Establish the objectives the 'system' (service in our case) is there to fulfil.

2) Use that to derive a set of necessary sub-objectives, and from those derive a set of key controls, which in turn let you construct a set of key control questions. This is where the auditor's/consultant's experience and ability to judge risk come into the equation. Had I been doing this as part of an internal audit programme in a large organisation, I would already have done a risk-based audit needs assessment to help me decide how much effort I was going to put into each assessment.

3) Having carried out a quick on-site visit to get an initial set of answers to my key questions (and I tended to keep these to fewer than 20 if I could), I would then normally be fairly clear whether a) the system appeared fit for purpose, or b) if I were the manager I wouldn't be sleeping at night. Generally I find the 'How well does the manager sleep?' test quite effective when mapped onto how well the manager should sleep - a manager who sleeps soundly when running a system that should clearly be keeping them awake at night is the worst-case scenario. If everything appeared to be in order, my next set of questions would try to determine whether that was through luck or because the controls were genuinely robust.

Jim, it's a great, top-down approach to focus on the overall goals of the system and then derive the necessary sub-objectives.
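Jim's three steps above can be rendered in outline as code. A toy Python sketch, where the objective, the key control questions and the verdict threshold are all invented for illustration, not taken from Sawyer or COBIT:

```python
# Toy sketch of the audit approach: system objective -> key control
# questions -> coarse triage verdict. Everything here is illustrative.

system_objective = "Restore normal service operation quickly"

# Step 2: key control questions derived from the objective
# (kept well under 20, as Jim suggests); answers from a quick on-site visit
key_questions = {
    "Are incidents logged with timestamps?": True,
    "Is there an agreed priority model?": True,
    "Are restore targets defined per service?": False,
    "Is there evidence the targets are reviewed?": False,
}

def verdict(answers, threshold=0.75):
    """Step 3: coarse triage - fit for purpose, or sleepless nights."""
    score = sum(answers.values()) / len(answers)
    return "fit for purpose" if score >= threshold else "not fit for purpose"

print(verdict(key_questions))  # → not fit for purpose
```

The point of the sketch is the shape, not the numbers: the questions flow from the system's objective, and the auditor's judgement sets the threshold.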

Agree 100%. But how does that relate to the long march through traditional capability maturation? I've read CMM and CMMI both in some depth, and I just don't see them starting from a systems point of view.

They stipulate an abstract model of process maturation and then apply it to specific decomposed practice domains in product (CMMI-DEV), sourcing (CMMI-ACQ), and service (CMMI-SVC). They do very little to discuss the overall goals and characteristics of a product, sourcing, or service system, except (by implication) as the combination of the practice areas they list - which is a complete violation of systems theory and its emphasis on emergent behavior.

I realize that you weren't talking about CMMI above, and the thread didn't really even start there, but whenever I hear the term 'process maturity' I go first to CMM/I as the most well-known example of 'how to do this.' Am I missing something? Charles T.

Charles, I think you've summed it up in your previous post. The approach I'm suggesting is driven purely by the desired outcomes, not by the artificial constraints of the CMMI model, and as I said in my earlier post, the intention of the assessments I first established was to place systems into one of three very basic pots: fit for purpose, not fit for purpose, and so bad the system isn't auditable. At a later point I layered CMMI-like maturity layers on top of it, because that is what the market asked for.

Let me be cynical about the market for a moment.

The main reason people want ITIL assessments is to prove they are no worse than anybody else, and to convince themselves that they don't need to fundamentally change what they are doing. Hence the comfort they derive from being told they don't need to strive towards level 5.

At Quint, in the early days at least, we made use of the Stadia model and worked with the concept that in the real world there are probably a small set of discrete stable states that a service eco-system can be in. It was this thinking that first led me to conclude that the apparent results from assessments didn't make sense. For a maturity level model to have credibility, it seems to me that it must be rooted in empirical evidence a) that it represents real-world states and b) that the progression between states follows a set order.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

James - 'For a maturity level model to have credibility it seems to me that it must be rooted in empirical evidence a) that it represents real world states and b) that the progression between states follows a set order.' Brilliant. I'd also like to see empirical evidence supporting any given set of proposed process areas.
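James's credibility test could, in principle, be checked against data: collect organisations' successive assessment results and see whether the observed progressions actually follow the model's set order. A toy sketch (stage names follow the usual CMM-style labels; the histories are fabricated purely for illustration):

```python
# Toy check of condition (b): do observed maturity progressions follow
# the model's set order, with no skipped or reversed stages?

STAGES = ["initial", "repeatable", "defined", "managed", "optimising"]

def follows_set_order(history):
    """True if each observed transition moves exactly one stage forward."""
    indices = [STAGES.index(stage) for stage in history]
    return all(b - a == 1 for a, b in zip(indices, indices[1:]))

# Fabricated assessment histories for three organisations
histories = [
    ["initial", "repeatable", "defined"],   # conforms to the model
    ["initial", "defined"],                 # skipped a stage
    ["repeatable", "initial"],              # regressed
]

print([follows_set_order(h) for h in histories])  # → [True, False, False]
```

If real-world histories routinely came back False, that would be evidence against the model's claimed ordering - which is precisely the kind of empirical grounding the comment is asking for and nobody seems to have published.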


I'm not aware of any research along those lines. It would have to involve some linguistic or anthropological approach. Otherwise we wind up with oddities like CMMI-DEV having four (count them) process areas related to 'project':

- Integrated Project Management
- Project Monitoring and Control
- Project Planning
- Quantitative Project Management

I don't see how these can be mutually exclusive, which is (I think) a hallmark of any solid framework. How did they derive that particular decomposition (and the other process areas)? What is the basis for it? Why is it optimal as compared to having one larger project management process area encompassing all four? As always, if someone can point me to some research where these things were done, I'd be appreciative. I keep thinking I must be missing something, that there must be some detailed, researched justifications underpinning CMMI. I'm less interested in research covering where CMMI has been applied, that then says 'the following benefits were seen.'

The idea that a service system has a relatively small number of stable states echoes my own thinking of late. I'll have to check out Stadia. Charles T.

I hope you don't mind me picking up the thread here. Your point is exactly why I frown when exposed to any discussion around the number of processes in the ITIL (2011) framework. As far as assessments go, I should have mentioned that COBIT does a lot of the hard work for you in terms of formulating questions, whilst still requiring the auditor's knowledge and experience to use those questions effectively. The innovative/clever element I tried to build in was to provide guidance on identifying the key indicators of the health of the ITSM eco-system. In my days as a real auditor, the standing joke was that if ever a manager said 'I run a tight ship' - a common phrase in those days - you could guarantee you would find major failings, if not a fraud.

'Our change success rate is 100%' seems to be the ITSM equivalent. And talking of change - if you want to start thinking through possible applications of ToC to ITSM, then change is a good place to start.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

Cary, quick comment - audit against a standard or inspect against a regulation, sure. But don't audit against the USMBOK - please - or even assess.

ITIL got hammered, and rightfully so, for suggesting you can meaningfully assess an organization's 'capability' against the books. As Skep has likely said - what a load of bollocks.

ITIL has bits missing, rather like your old grandfather. It's not all there, and it doesn't work the same way it did when it was young. If you want to trumpet that your organization has a level of maturity against a framework - knock yourself out. Just expect to be viewed across the room the way I viewed my grandfather - a bit kooky.

If you assess anything, do it against how you are helping the customers you serve succeed. As I may have said earlier (senile moment?), maturity by what criteria - completeness against a specification, age and wisdom? I prefer customer outcomes and levels of satisfaction. First test, then: does the framework explain, in terms my dog could understand, the basis for customer satisfaction and how it can be measured and managed? Is that an echo I hear?

Are we not preparing to throw the baby out with the bathwater here, though? I'm happy to agree that ITIL process assessments should never be considered to mean something they are not, and that too often they are misused or the expectations are way off. But are we not now at risk of going further than we should by discounting all value associated with process maturity assessment in the ITIL space just because of it? For my money, ITIL process maturity assessments should never be expected to deliver the same assurance that, for example, an all-encompassing IT audit looks to validate.

It is called a process maturity assessment, not an audit of the IT function. Even if we are not going out to the wider ITSM space, just remaining within the ITIL coverage, even those books clearly expect an undertaking by the service provider to understand and embed value creation and delivery to the end customer: it is mentioned in the context of Strategy, Design, Transition and CSI that you need to ensure the processes are right and linked to supporting the services provided to the customer (which ITIL now defines, of course, as providing value to the customer without the ownership of risks etc. etc.). It is not absent in the theory of ITIL - so you can argue that if ITIL theory is followed properly, that SHOULD ensure a linkage to the end customer outcomes. But that is just a side note.

My main suggestion is to consider a (well executed and properly scoped) ITIL process capability maturity assessment as a piece of the puzzle when it comes to building the IT function's balanced scorecard. Following the BSC build methodology, process maturity will be a leading indicator for a number of other scorecard aspects, and ultimately there must be a linkage established to customer (business) value delivered. Within an overall frame of understanding whether things are going in the right direction, I believe process maturity assessments do have a place. But that means understanding the context and actually making the effort to work through those contexts to ensure the balancing is right. This is very different from an audit approach which, if done well, will cut across the whole management system to understand if it is built right, and I don't think we are being fair in expecting process maturity assessments to provide the same output.
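The balanced-scorecard placement suggested above might look like this in miniature - process maturity feeding one perspective among several, never the headline number on its own. The perspectives, weights and scores here are illustrative assumptions, not BSC doctrine:

```python
# Miniature sketch: process maturity as one leading indicator inside a
# balanced scorecard. All weights and scores are invented for illustration.

scorecard = {
    # perspective: (weight, score normalised to 0..1)
    "customer":          (0.35, 0.60),  # satisfaction surveys
    "financial":         (0.25, 0.70),  # cost per service transaction
    "internal process":  (0.25, 0.40),  # <- where maturity results feed in
    "learning & growth": (0.15, 0.55),  # training, staff retention
}

def overall(card):
    """Weighted roll-up across perspectives."""
    return sum(weight * score for weight, score in card.values())

print(round(overall(scorecard), 3))
```

The design point is that a maturity score only moves one weighted slice of the picture; the linkage to customer and financial outcomes has to be built explicitly, which is the comment's argument.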

Even within the narrow scope of ITIL (and it's nice to hear someone reminding us it is narrow), the Tipu method looks at Risk and Value as the primary metrics for assessing current state and prioritising improvement (in this case Value meaning outcomes delivered that align with business goals/objectives). Capability/maturity is of little interest - it doesn't mean much.

Skep - exactly what I've been driving at. I'll have to look at Tipu one of these years.

I'm going to tell a story. In 2004, as an application manager for a Fortune 500 shop, I was told to send some work overseas to the offshore CMM Level 5 team. I devised a little test.

We found a trivial module that needed coding in Visual Basic 6. I wrote up one page of simple coding guidelines.

The first guideline was, 'You must not use global variables. If you use a global variable, the deliverable will be rejected.'

Not an unusual or extraordinary coding standard - quite the contrary, global variables are notorious red flags for amateur code. The code came back completely based on global variables.

Since then, I have been skeptical of the concept of capability maturity, at least as it has been developed in the U.S. After reading Theory of Constraints and exploring systems theory, I came to realize that it was likely quite harmful from those perspectives. Perhaps there are some more limited uses. The idea of a staged maturity model probably will never go away; heck, our education systems are based on one. (Hmm, maybe that's the problem.) Some practices do depend on mastering other practices; you can't learn calculus until you know algebra. In the past, some things I've wanted to do in a given organizational context weren't possible until we did some other things, and that entire process reasonably could have been called a 'maturation.' Some of us have seen certain maturation sequences frequently enough that they could be called patterns (which is an under-utilized approach IMHO, much less pretentious than a 'standard').

But the reasoning behind some standards' maturation sequencing is impenetrable. Heuristic, at best.

Certainly nothing to audit against. And when the sequencing is combined with a functional decomposition (as in CMM/I), the result, I think, is to strengthen silos and sub-optimize the whole. Charles T.

It seems to me that there are different kinds of assessments, with different levels of value. There's the vendor assessment, one where the vendor sends some staff for a day or two at a very low cost.

These generally focus on the perception of some key staff. The assessment is accompanied by some cool spider diagrams, etc. It documents the staff's perception. Designed to generate sales.

At the other end of the scale is what Ian is discussing: an audit against some standard like ISO 20k, or best practice like USMBOK. As Charles points out, some of these don't examine customer service performance evidence, customer satisfaction, costs, etc.

A comment in the stream is that customers hire consultants to answer a question. They must want the question answered.

They're paying, so answering the question must have value for them at the time. It seems to me that there may be quite a bit of value in periodically documenting advancement and setting a new baseline.

I conceive there may also be considerable value in producing actual evidence of performance statistics, customer satisfaction perceptions and costs for analysis.

'Actual evidence of performance statistics, customer satisfaction perceptions and costs for analysis' would be great. You won't get those from an ITIL maturity assessment. And the only baseline you get from an ITIL maturity assessment is a vendor-proprietary one that reflects the vendor's perception, as you point out. There are several forms of assessment but only one kind of ITIL assessment: a proprietary measurement of a metric that has little value - maturity/capability. Assessment can be useful, as several people have said in comments. ITIL maturity assessment in particular isn't useful. I guess one way of paraphrasing my post in terms of your comment is that an ITIL assessment is a dumb question for a customer to want answered. It's our job to tell them that.


Bringing up Lean in this context is a great point. What with all the buzz about lifecycle, updates, more content etc. since the evolution from V2, we seem to have forgotten to talk about some of the basics of Service Management. In this context, it is that you need to understand the reasons you are doing it in the first place. Everyone remember IT-business alignment? There is not a lot of talk about it nowadays. The ITIL V3 focus word 'value' is better explained in the 2011 edition, but it may just muddy the waters, and it is still too theoretical compared to Lean.

In essence, you should only do as much of any process as increases the value it delivers. Anything further, even though it may give you a higher maturity score, is waste, and therefore you should not be doing it. How much is enough differs in each organisation depending on many factors - and will be different for each process, too. You do need to do the balancing act between process maturity and the value delivered.

Note that I have yet to work for or hear about an organisation that was judged to have reached maturity level 5 in any process.
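The Lean rule above - stop investing in a process once an extra maturity step no longer pays for itself - can be sketched as a marginal-value cut-off. All the figures here are invented purely to show the shape of the reasoning:

```python
# Toy marginal-value rule: climb maturity levels only while the extra
# value gained exceeds the extra cost of getting there. Numbers invented.

# Value delivered and cumulative cost at each maturity level
value_at = {1: 10, 2: 30, 3: 45, 4: 50, 5: 52}
cost_at  = {1:  5, 2: 15, 3: 28, 4: 45, 5: 70}

def target_level(value, cost, start=1):
    """Highest level worth reaching before further maturity becomes waste."""
    level = start
    while level + 1 in value:
        gain = value[level + 1] - value[level]
        spend = cost[level + 1] - cost[level]
        if gain <= spend:   # past this point, extra maturity is waste
            break
        level += 1
    return level

print(target_level(value_at, cost_at))  # → 3
```

With these illustrative numbers the answer is level 3, not 5 - which matches the commenter's observation that nobody sensible ends up at level 5: for most processes the marginal value runs out well before then.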

I have no doubt it happened somewhere to someone. I would bet they are not large IT shops, though.

In theory, it could be. Reminds me of the time I was engaged by a hardware vendor to run an independent TCO assessment of a company based in Brisbane, using the old Gartner TCO Manager for Distributed Computing. Thanks for the reference, but that does not yet settle it for me.

It talks about Boeing rated at level 5 for SW-CMM - in other words, for their software development (and reuse) approach in a specific set of practices. That is a very narrow focus and certainly not the breadth that you see within an ITSM framework.

Hi Skep, all fair comments, but I suspect at least half of the problem here is the question asked of the consultant, and indeed who is asking it.

Rich, long time no see. I believe Skep's point is exactly that: the maturity model is not measuring what the company needs; it measures how mature the processes are within the organisation. It does not tell you anything about how much those processes contribute (if at all) to solving the company's problem, or, in a better scenario, align to delivering the value the company tries to create. I think the issue is that many people believe a process maturity assessment is a measure of how well the organisation is doing. If I wanted to be overly supportive of ITIL, I would say that such a maturity assessment can only stand a chance of getting near that point if the scope includes the full lifecycle, more specifically the Strategy aspects (I am assuming the assessment itself is done well).

Simply because ensuring that the processes are integrated, and that emphasis is given to those that support the company's ultimate direction (i.e. aligned to corporate strategy), are all matters for a proper Service Strategy. Which, I believe, most IT (service) organisations don't have, incidentally. So an assessment looking at an IT organisation's operational processes should not be taken in isolation to mean anything that it does not.

Anybody trying to govern and make decisions about the direction of their IT organisation based on an ITIL MA alone is deficient. Then again, I leave it up to you to decide if it is at least a step further than making decisions based on individuals' instincts and beliefs alone. To pick up on your point about the Middle East experience: you only have to look at the typical questions on various ITIL-related LinkedIn groups. A lot of the questions coming from regions where IT management is not as developed a practice as in the countries at the forefront indicate a natural lack of experience and understanding in how to effectively govern IT in general. I am not blaming these people for asking - it is natural progress to gradually widen your horizon - and for my money it is a sign of accepting the applicability of external frameworks, approaches, measurement methods and standards that somebody within such an IT organisation goes for an external ITIL MA, as one step on the road to a more rounded approach in the end. Could we develop a better understanding of what MAs are and are not?

On what their place is in the overall scheme of learning 'where we are' to decide 'what we need to do' to get to 'where we want to be'? Yes, we can, and should. For my money, discussions such as this are helpful in doing that. But I would not discount ITIL MAs as valid tools in the toolbox.

Hi Peter, and hope you're well. I understood the point, but I think it's worth pointing out that the question is king here. Of course, consultants have a responsibility to shape that question too, but I know we've both come across many situations where a client is simply looking for independent support of his political aims; that's never going to be stated in the assessment report, of course. In sympathy with Ian's point, I've long considered that we ought to be looking for how the outcomes of IT activity, including processes et al., really enable the business - that is, really meet the customer's genuine, validated needs.

Of course that's more tricky where they're not defined, and it's less common for consultants to be engaged to answer that particular question, for obvious reasons. Where I might depart from Ian slightly is that the answer does necessitate some IT navel-gazing; that is, what process/people/technology/supplier factors are resulting in a failing outcome? That is where ITIL's prime value lies, I believe. Again, though, more often than not the question is more constrained than that, precisely because the sort of person who could answer the question from all angles is undoubtedly prohibitively expensive. Hence the tendency to focus on those elements from which the answers might come: that is, the service management related processes and activities. 'Tell me how I might manage this better so that I can improve my customers' experience of and value in my IT services' is a much better question, and one that requires more holistic assessment and roadmapping. I didn't mean to imply that MA tools are worthless, but actually to make your point: what are we actually assessing here? Because if all we're doing is scoring against a perceived standard in ITIL - which is a dubious end in itself - then I'd agree we're falling well short.

Rich Pemberton

PS.

There is a lack of maturity in the ME marketplace, and that's true of the consultants out there too; I can feel a disaster coming. ;-)

I've been on both sides of the fence: consulting, and working for big IT shops. A maturity assessment as the means and the end is pointless, as discussed many times over. But as part of an overall CSI plan that takes in business benefits, customer perception, risks and the business landscape - understanding what you've got, what you need, and what to do next - it is a good thing. But YOU CAN DO IT YOURSELF (if you have a reasonable amount of experience and a working bulls#it-meter). I subscribed to an inexpensive on-line tool for capability assessment / maturity assessment / ISO20k compliance. The benefit for me was the question library. It asked a pretty balanced set of questions.

ITIL is explicitly, by its own definition, NOT a measurement or even a tool. It is a governance framework based on the selection of best practices. People expect ITIL to solve their problems in an automated fashion. Only the hard work of people can accomplish that, and ITIL provides them with a framework to do it. ITIL isn't a tool - it is a guide for applying energy in the right places at the right time.

The energy still needs to be spent, still needs to be measured, and still needs to be refined over time. ITIL is not a 'money saver', it is not a 'silver bullet', and it is not just 'certificates and training'. It is insane when people come out of one or many ITIL training courses and think they're done.

That's just the start. Then you have to actually do the heavy-duty analysis and implementation work, the measurement and the continuous improvement.

That still has to be done, and ITIL training doesn't eliminate any of that effort. Instead, if you keep on track, that effort hopefully produces measurable and positive results more efficiently (though certainly not optimally) than would otherwise have happened, and you'll be able to improve it over time.

Do you need ITIL to do this? Do people fail with ITIL? But ITIL certainly helps the organization that has trouble aligning its service priorities with management distractions and business realities.

I knew that ITIL maturity assessments are useless because, well, ITIL told me (ITIL V3 CSI, p.96). They are only a snapshot in time and ignore process dynamics and/or cultural issues. They are vendor/framework specific. Improving maturity, as opposed to delivering value, can become its own goal. ITIL maturity assessments provide only a kick in the teeth, but sometimes that's the point.

Internal forces (too politically isolated to make any difference) can have their pet peeves or hunches validated by a third party with more credibility. I also know this because of ITIL. Okay, I know all of the above from personal experience, but the fact that ITIL already told us this years ago makes it, well, not newsworthy. It is still worth reminding ourselves from time to time. On Friday I reviewed a year-old maturity assessment of four of our parent company's processes (Incident, Problem, Change, and Config - I have no idea why these in particular). I couldn't help but think of the same issues. Any credible insider could have said the same things - and I know who they are. The recommendations don't really follow from the maturity assessment but from general knowledge of good practices. However, the organization is too big to fail and needed a credible outsider to tell it what it already knew. They used a vendor-specific framework.

I cannot help but think that these have gone the way of the dinosaur. Standard frameworks such as COBIT or ISO20000 are more credible, transparent, and useful.

COBIT and ISO20K have different purposes.

A maturity assessment tells you exactly what Skep is on about: how mature your thinking and acting is when compared with those criteria. So whoopee, you have achieved a level 3.5 of maturity for a process. How do you suggest that relates to customer satisfaction? I thought everyone knew there is no correlation between process improvement or capability maturity levels and customer satisfaction - none. As I hope you appreciate, ISO20K should be used for conformity assessment as part of gaining the certificate - period. It's binary - conform or not when an audit is involved - no 'maturity' beyond that. Now, assessments of processes can be useful, once you have a reason to perform them that is linked to a customer issue or situation. But when it comes to understanding if, and at what level, they help satisfy customers and deliver appropriate experiences, they flunk. Because they are inside-out.
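To see how little a single maturity number like that 3.5 actually carries, consider how such scores are typically rolled up: a weighted average of per-dimension ratings (for instance the PMF dimensions of vision, culture, people, process and technology mentioned earlier). A minimal sketch, with invented weights and scores - real assessments use the vendor's own proprietary rubric:

```python
# Hypothetical maturity roll-up: a weighted average of 1-5 ratings
# per assessment dimension. The dimensions follow the ITIL PMF;
# the scores and equal weights are made up for illustration.
scores = {"vision": 2, "culture": 1, "people": 2, "process": 3, "technology": 2}
weights = {"vision": 0.2, "culture": 0.2, "people": 0.2, "process": 0.2, "technology": 0.2}

maturity = sum(scores[d] * weights[d] for d in scores)
print(round(maturity, 1))  # prints 2.0 - a single number that says nothing
                           # about customer satisfaction or value delivered
```

The point of the sketch is the commenter's point: whatever the weights, the output is one inside-out number with no term in it for customer outcomes.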



COBIT Basics

COBIT is a methodology that aims at connecting business goals to IT goals, assigning objectives and duties to both business and IT leaders.

It provides the resources to build, monitor, and improve its implementation, while helping to reduce costs, establish and maintain privacy standards, and give structure and oversight to general IT processes within the company. These resources include:

- Frameworks, which help to achieve a balance between benefits and risks
- Process Descriptions
- Control Objectives
- Management Guidelines
- Maturity Models

The COBIT framework is based on these five guiding principles:

- Meeting Stakeholder Needs (value creation for enterprise stakeholders). Source: COBIT® 5, figure 3. © 2012 ISACA® All rights reserved.


- Covering the Enterprise End-to-End (coverage of all corporate processes and functions that relate to information flow and technologies). Source: COBIT® 5, figures 8 and 9. © 2012 ISACA® All rights reserved.

- Applying a Single Integrated Framework, with a single set of standards to be used across the business.
- Enabling a Holistic Approach among seven categories of enablers as defined by COBIT 5: principles, policies and frameworks; processes; organizational structures; culture, ethics and behavior; information; services, infrastructure and applications; people, skills and competencies. Source: COBIT® 5, figure 12. © 2012 ISACA® All rights reserved.
- Separating Governance from Management. Source: COBIT® 5, figure 15.

© 2012 ISACA® All rights reserved.

ITIL Basics

ITIL is a framework that focuses on and enables IT services to be managed across their lifecycle: from mirroring the IT landscape's components (configuration items) in a centralized knowledge base, to registering their lifecycle (changes, events, and incidents), to managing the evolution of those configuration items (versions, integrations, and so on).
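The configuration-item lifecycle described above can be sketched as a toy data model. All names here (the CI class, the event kinds, the dict standing in for a CMDB) are illustrative assumptions, not drawn from ITIL or any real ITSM tool:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LifecycleEvent:
    kind: str        # illustrative kinds: "change", "event", "incident"
    summary: str
    at: datetime

@dataclass
class ConfigurationItem:
    name: str
    version: str
    history: list = field(default_factory=list)  # the registered lifecycle

    def record(self, kind: str, summary: str) -> None:
        """Register a lifecycle event against this CI."""
        self.history.append(LifecycleEvent(kind, summary, datetime.now(timezone.utc)))

    def upgrade(self, new_version: str, summary: str) -> None:
        """Manage evolution: a version change is recorded as a 'change' event."""
        self.record("change", summary)
        self.version = new_version

# A toy "centralized knowledge base": a dict keyed by CI name.
cmdb: dict[str, ConfigurationItem] = {}
ci = ConfigurationItem("mail-server-01", "1.0")
cmdb[ci.name] = ci
ci.record("incident", "Mailbox database offline")
ci.upgrade("1.1", "Applied storage driver patch")
print(f"{ci.name} v{ci.version}, {len(ci.history)} lifecycle events")
# prints: mail-server-01 v1.1, 2 lifecycle events
```

A real CMDB adds relationships between CIs, audit trails, and integration with change and incident processes; the sketch only shows the mirror-register-evolve cycle the paragraph names.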
