We've learned how to take processes and use them to prototype and think through what we want to build with enterprise software, so we don't create things that users don't like. I mentioned there are three tools we're going to use to extend this practice of agile into enterprise software a little more seamlessly, a little more effectively. The last one here is thinking in terms of zero, 30, and 90 day success criteria as we decide how we're going to end our iterations and validate our results. You saw this process in the last video, and now we're going to look at how we would validate it.

First, we're going to look at the trio of problem scenario, current alternative, and value proposition that you learned about in module two. The problem scenario they're facing around donor development is implementing the best practices they've learned so the new donor development manager can use them, keeping those practices aligned with corporate objectives, and making sure they don't end up with an overage of BVA (business value added) reporting, with the executive director bugging the development manager all the time to stay informed. They've been doing this with a series of spreadsheets on a shared drive; that's not uncommon, and perhaps okay. Let's not be overly dismissive of those spreadsheets: sometimes they're a much better alternative than many types of enterprise software. That's the reality. Automation doesn't immediately make things better; it has to be thoughtful, and it has to be useful to the user. Our value proposition is that if we're thoughtful about it, the CRM we're going to use will automate and structure things in a way that's enough better than this current alternative that they're going to find it useful.

So, how do we actually figure that out, and how do we implement that validation in the normal cadence of getting this project done? One good way is to think about zero, 30, and 90 day success criteria for every feature.

Zero day success criteria are basically usability testing, which you'll learn a lot about in courses two and four. Here, they're just going to put the interface they have for entering leads in front of the development manager, ask them to put in five real prospects, and observe whether everything goes into the fields they thought the development manager would use; basically, whether the leads go in okay and the development manager is able to find things and execute the task.

Our 30 day criteria is: okay, we've validated that the system is usable, but after 30 days, are they actually using it or not? Leaving this loose end out there eats at the certainty that we talked about all of us wanting; we want it, our managers want it, but we can't create certainty where certainty doesn't exist. If we put the blinders on and don't validate what we've done, we're just going to end up with an accumulation of things that users hate, and we're going to have an IT project that's a failure. On the other hand, we don't want to stop everything and make this really complicated. If we just think about what we could look at after 30 days to see whether people are actually doing this or not, then during the system's infancy we're keeping a nice close eye on it. That's a great way to build observation into your agile cadences in a way that isn't too invasive and doesn't generate a lot of extra overhead.

In this case, we can just look at whether the donor development manager is logging in at least 18 times across 18 working days, because if they're logging in less than that, it probably means they're not really putting in leads this way. Here's a rough sketch of what that check might look like.
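This is a minimal sketch, assuming a Salesforce backend and the simple_salesforce Python library; the credentials, the user ID, and the 18-day threshold are placeholders, and LoginHistory is Salesforce's standard login-audit object.

```python
# Hypothetical 30-day usage check: count the days on which the development
# manager logged in over the last 30 calendar days (roughly 18 working days).
from simple_salesforce import Salesforce

sf = Salesforce(
    username="admin@example.org",   # placeholder credentials
    password="...",
    security_token="...",
)

DM_USER_ID = "005XXXXXXXXXXXXXXX"   # placeholder: development manager's user ID
TARGET_LOGIN_DAYS = 18              # our 30 day success threshold

# Pull the development manager's logins from the last 30 days.
result = sf.query(
    "SELECT LoginTime FROM LoginHistory "
    f"WHERE UserId = '{DM_USER_ID}' AND LoginTime = LAST_N_DAYS:30"
)

# Count distinct days with at least one login (LoginTime is an ISO
# timestamp, so the first 10 characters are the date).
login_days = {rec["LoginTime"][:10] for rec in result["records"]}

print(f"Logged in on {len(login_days)} of the last 30 days")
if len(login_days) >= TARGET_LOGIN_DAYS:
    print("30 day criterion met: the system appears to be in regular use.")
else:
    print("30 day criterion missed: time to observe and find out why.")
```

A check like this can run as a scheduled script or live in a standard report; the point is that it's cheap to look at and keeps the observation lightweight.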
Then the 90 day success criteria is: is this actually helping us achieve the outcomes, the propositions that we originally envisioned, and how do we measure that? Now, you may say, our projects have a longer cadence than that; we probably need 120, 150, 180 days. That's fine, and that's normal. You have to use your own professional judgment about how that would work, but I would urge you to do it on the sooner side, because it's always better to re-evaluate sooner and go on another iteration if you feel like you need more observation than to wait too long for a result that's going to change your course of action and your focus.

Some examples of how this might work: donors and accounts are the same thing here, and we want to see 80% of our growth in target segments, meaning that when we plan out a campaign and try to be purposeful about aligning our development with our corporate objectives, that's actually happening. They have tagging and reporting instrumented into the system to figure that out. They also want this other result: if the growth is not in the target segments they wanted, can they at least identify why in the postmortems they put on the opportunities? Maybe they don't meet their goals for a given campaign, and that's not really the process's fault; it was just a campaign that wasn't meant to be, with a charter and an audience that didn't line up with each other. Through this automation and process, and to deliver on the propositions we wanted, we want to be able to understand that through the system. Either a positive result or a negative one that lets us know what's happening would be a good outcome for the system.

Let's look at another example. There's a donor recognition process that we looked at in the second example. Here, the problem scenario is consistently recognizing donors, so that donors know the organization appreciates what they do and so it can create and maintain nice, durable relationships with them. The current alternative is trying to remember to do this: post-it notes, and occasionally a spreadsheet if they have a lot of donors. It's pretty ad hoc. This is an area where automation could actually be really great, as we saw in example two. The Salesforce implementation will automate this, and where we have a manual process, because we saw that for donations over a certain amount they want the executive director to send a handwritten note, at least we'll be able to keep an eye on that, without the development manager bugging the executive director all the time about sending these, and without the executive director maybe not having all the inputs on hand that they need to do it.

So, what are the validation criteria here? Zero day would be that the development manager inputs sample opportunities that close. We obviously need test data for this one, because we don't want to send real donors emails. Then we go through and check that the correspondence goes out as we expect: they put these things in, and the fake donations that should be recognized are recognized.
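Here's a minimal sketch of that recognition routing in plain Python, not actual Salesforce automation; the $1,000 threshold and all the names are hypothetical stand-ins for real workflow actions. It walks through the zero day test with fake donations and previews the kind of 30 day reconciliation we'll talk about next.

```python
# Hypothetical recognition routing plus a simple reconciliation report.
from dataclasses import dataclass

HANDWRITTEN_NOTE_THRESHOLD = 1_000  # assumed cutoff for an ED handwritten note

@dataclass
class Donation:
    donor: str
    amount: float

recognition_log = []  # (donor, recognition action) tuples

def recognize(donation):
    """Route a closed donation to the right recognition step and log it."""
    if donation.amount >= HANDWRITTEN_NOTE_THRESHOLD:
        # Large gift: queue a task for the executive director, with the
        # donor details attached so nobody has to chase them down.
        action = "ED handwritten-note task"
    else:
        # Everything else gets the standard automated thank-you email.
        action = "thank-you email"
    recognition_log.append((donation.donor, action))
    print(f"{donation.donor}: {action}")

# Zero day check, with test data only -- we don't email real donors.
closed = [Donation("Test Donor A", 250), Donation("Test Donor B", 5_000)]
for d in closed:
    recognize(d)

# Reconciliation: every closed donation should appear in the recognition
# log; anything missing is a gap to investigate at the 30 day mark.
recognized = {donor for donor, _ in recognition_log}
missing = [d.donor for d in closed if d.donor not in recognized]
print("Unrecognized donations:", missing or "none")
```

Running this, Test Donor A gets the thank-you email, Test Donor B gets the executive director task, and the reconciliation comes back empty; that's the shape of the result we'd want from the real system.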
The 30 day criteria here would be to have some reporting where they make sure that emails are, in fact, going out for all the recognized opportunities or tasks, and that closed donations end up recognized. After 90 days: is donor involvement up, and are donations up? Also, and I didn't write this down here, but it probably is important: are the executive director and development manager no longer having to spend a lot of time on this? Because that's part of the proposition here.

Another thing I'd like to leave you with, as you think about validating things and where you focus: when you start an enterprise software project, a lot of the time the question is, we're going to do, like, 10 different things; which one should we start with? One thing I would do is use these problem scenario, alternative, value proposition trios to think about where the propositions are most compelling; you can really help yourself score wins early by doing that. Wins with enterprise software, and this is just very generally speaking, are things where you're able to reduce the amount of paperwork, or the number of systems or screens people have to use. Automation, anything that really lends itself to automation, is always a good win; the donation recognition we just saw is a good example of that. Consolidation and visibility are good goals too, but generally speaking, I think those first two will deliver quicker, better wins for you.

So, we looked here at how we take the process designs we formulated and instrument the kind of closed-loop observation that's so important to making sure we're driving at something valuable with agile.