Durham University Iterative Model Professional Praxis Paper

In the following continuation of the Professional Praxis, you take part in the process specification/flow/engineering stage by stepping out of the project perspective and assessing how all of the pieces of process (that is, how you accomplish things) fit together.


Pat’s status message on the instant messenger reads, “‘A foolish consistency is the hobgoblin of little minds.’ – Ralph Waldo Emerson”

Thinking about this, you IM Pat saying, “Hey! Your status message made me think about our project. How do we know whether the processes that we use as software engineers are not a foolish consistency, repeating processes that are ineffective?”

Pat responds, “I agree. We do need to think about that. Now that we have switched to the iterative model, let’s do some process specification/flow/engineering. Process specification, flow, and engineering will help us evaluate and ultimately remedy process issues, improving cost, workflow, and the final product. Here’s a diagram showing how process engineering works.” The diagram is attached below.

You respond, “I’m on it. I’ll specify our process through a diagram. I’ll also reexamine and verify the content, format, and outputs of the processes, and if they don’t match, I’ll make recommendations to bring them into alignment. I’ll also figure out which metrics need to be used to assess the overall process.”


To prepare:

  • Read “Sketching Out BPM,” focusing on a method of process specification/flow/engineering.

Create a process diagram that links all of the processes (from the previous assignment, which is attached below) through their inputs and outputs. Reexamine and verify that the outputs of one process have the appropriate content and format for the next process step. If they do not match, revise them accordingly. Write a 3- to 4-page paper that explains and provides the rationale for the changes made to the processes in order to ensure consistency between the inputs and outputs. The paper must also identify, and include the rationale for, the metrics that will be used to assess the overall process. Support your rationale using this unit’s resources and/or your own research. Use proper APA format and citation. In the appendix of your paper, insert the process diagram.
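As a rough, non-prescriptive illustration of the input/output check described above, the sketch below (written in Python, with hypothetical process names, outputs, and formats rather than the actual assignment processes) walks a chain of process steps and flags any step whose outputs do not match the content or format expected by the next step.

```python
# Minimal sketch: verify that each process's outputs match the next
# process's expected inputs. Process names, fields, and formats here
# are hypothetical placeholders, not the actual assignment processes.

from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    inputs: dict        # expected input name -> expected format
    outputs: dict       # produced output name -> produced format

def check_chain(steps):
    """Return a list of mismatches between consecutive process steps."""
    mismatches = []
    for current, nxt in zip(steps, steps[1:]):
        for needed, fmt in nxt.inputs.items():
            if needed not in current.outputs:
                mismatches.append(
                    f"{nxt.name} needs '{needed}' but {current.name} does not produce it")
            elif current.outputs[needed] != fmt:
                mismatches.append(
                    f"'{needed}': {current.name} produces {current.outputs[needed]}, "
                    f"{nxt.name} expects {fmt}")
    return mismatches

# Hypothetical three-step chain for illustration only.
chain = [
    ProcessStep("Requirements analysis", {}, {"requirements list": "spreadsheet"}),
    ProcessStep("Design", {"requirements list": "spreadsheet"}, {"design spec": "UML model"}),
    ProcessStep("Implementation", {"design spec": "text document"}, {"build": "binary"}),
]

for issue in check_chain(chain):
    print("Mismatch:", issue)   # flags the UML model vs. text document gap
```

Running the sketch prints the mismatch between the design output format and the implementation input format, which is exactly the kind of inconsistency the paper should document and then resolve.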

The iterative model has proven to be more effective than the waterfall model. However, this does not mean that setting up projects using the iterative approach is easier. Project managers often find it very difficult to use an iterative approach during a project's initial stages, when risks and the chances of failure are high. Waterfall models follow a linear sequence of analysis, design, coding, and testing, with little feedback to earlier phases. The main problem with the waterfall model is that risks are pushed forward into later phases rather than being fixed in the phases where they first appear (Gharajeh, 2019). Discovering defects during the late stages makes the project very costly to complete, or leads to its cancellation, since the waterfall approach masks risks until the late stages, when little meaningful can be done.
The iterative approach works much better because it forces risks to be identified during the initial stages of a project, where they can be resolved efficiently and promptly. The iterative model is advantageous over the waterfall approach in the following ways:

  • Risks are identified early, making them easier to rectify.

  • Users can provide feedback at an early stage, minimizing changes later in the project's lifecycle.

  • The development team is forced to resolve project issues early on, making it easier for team members to focus on the important matters of the project.

  • Iterative testing makes it easier to assess the project's progress against its objectives.

  • Project stakeholders are given concrete evidence of the project's progress.

  • Inconsistencies between the project's requirements and its implementation are detected early.
Process of the Iterative Model
Risk mitigation in the iterative model
In the iterative process, risks are identified and addressed during integration. As the project is rolled out, all of the process components are checked with tools that detect possible risks, which makes it easier to uncover unsuspected ones. A project that was bound to fail therefore fails during its initial stages, before a great deal of cost and time has been spent. The iterative model allows the team to confront risks directly and to mitigate them early on (Alshamrani & Bahattab, 2015). The most common risks are architectural and integration risks, and these problems are detected before the project moves into later phases.
Project changes
The iterative model works somewhat differently from the waterfall model: it is cyclical, whereas waterfall follows a step-by-step process. After the project is initiated, most of the stages are repeated continuously, with each pass improving on the previous one and feeding an incremented product into the next cycle.
Planning is the first stage. It involves mapping out the project's requirements and specifications and preparing for the phases that follow.
Analysis and design: after planning, analysis is carried out to lay down the database models and business logic appropriate for the success of the project (Okesola et al., 2020). Technical requirements are also established during the design phase; these will be used to meet the project requirements identified in the analysis phase.
Implementation: the actual coding of the project now begins. The specifications and the project's design are coded and implemented during this stage.
Testing: after coding is completed, a series of tests is run to identify bugs or issues that could otherwise have been overlooked.
Evaluation: once all previous stages have been completed, the entire project is examined. Clients are also invited to share their views about the project and to recommend changes where they believe they are needed.
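To make the cyclical flow of these stages concrete, here is a minimal Python sketch (the backlog items, the per-iteration slice size, and the toy testing check are all hypothetical, not part of any real project plan) showing planning, analysis and design, implementation, testing, and evaluation repeating until the backlog is exhausted.

```python
# Minimal sketch of the iterative cycle described above; the stage logic
# and backlog are toy placeholders for illustration.

def iterate(backlog, items_per_iteration=2):
    increment = []                                # the growing product
    iteration = 0
    while backlog:                                # repeat until evaluation finds nothing left to plan
        iteration += 1
        planned = backlog[:items_per_iteration]   # planning: pick a slice of requirements
        designs = [f"design for {req}" for req in planned]        # analysis and design
        increment += [f"code for {d}" for d in designs]           # implementation
        passed = all(piece.startswith("code for") for piece in increment)  # testing (toy check)
        print(f"Iteration {iteration}: {len(increment)} pieces built, tests passed={passed}")
        backlog = backlog[items_per_iteration:]   # evaluation: accept the increment, re-plan the rest
    return increment

iterate(["login feature", "report module", "search feature"])
```

Each loop pass corresponds to one full cycle of the stages above, which is why feedback and risk discovery happen early rather than at the end of a single linear run.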
References
Alshamrani, A., & Bahattab, A. (2015). A comparison between three SDLC models: Waterfall model, spiral model, and incremental/iterative model. International Journal of Computer Science Issues (IJCSI), 12(1), 106.
Gharajeh, M. S. (2019). Waterative model: An integration of the waterfall and iterative software development paradigms. Database Systems Journal, 10, 75-81.
Okesola, O. J., Adebiyi, A. A., Owoade, A. A., Adeaga, O., Adeyemi, O., & Odun-Ayo, I. (2020, July). Software requirement in iterative SDLC model. In Computer Science On-line Conference (pp. 26-34). Springer, Cham.
Sketching Out BPM
With a BPM system, you can detach business logic from infrastructure to create useful applications more quickly than ever before.
The benefits claimed by BPM (business process management) software sound almost too good to be true. Proponents crow about lower app dev costs, shorter time to market, improved compliance enforcement, and new points of leverage for optimizing business performance.
BPM software can't improve anything by itself, of course, but it can be a powerful weapon when combined with business-oriented documentation and analysis. Within its own controlled, high-level app dev environment, BPM wraps IT solution development within business-driven modeling and performance measurement.
At the least, BPM provides an effective new medium through which the business side can communicate its requirements to IT. At best, it can distill functionality from existing applications and free business logic from the bonds of existing infrastructure to enable unprecedented agility.
One persistent problem for potential adopters, however, has been abject confusion. BPM solutions come in so many varieties that only a handful of consultants seem to know which solution is best for the task at hand.
Today, clarity is emerging in the form of the BPM suite, an integrated set of tools and runtime components designed to create software analogs of business processes. Together, these elements allow customers to model, deploy, and monitor BPM systems without having to staple together bits and pieces of technology from different vendors.
By Bruce Silver (illustration by Kate McKeon)

[Figure: The Four Phases of BPM. At a high level, the flow for developing BPM solutions resembles that of any other app dev cycle, but BPM's core features (graphical modeling, automated application generation, and integration with legacy applications) make it much faster. Spanning the business and IT sides, the diagram shows an analytical process model (flowchart, resource model, KPIs, simulation analysis), an auto-generated design (flow, process data, skeleton integration), a complete executable design (detailed integration, data transformations, exception handling, UI design) supported by EA modeling (data model, components) and reusable code components (services), and performance management (dashboards, BAM, analytics).]

Properly utilized, BPM suites account for the fact that the internals of elemental process activities, particularly those implemented by existing business systems, may be difficult to modify.
Instead, BPM suites enable IT to optimize business performance by changing the process logic that interconnects them. Process design in a BPM suite is akin to a flowchart, annotated with the necessary implementation detail. It requires little code, and the process logic is easily changed, qualifying BPM as a style of agile application development.
The Basic BPM Flow
BPM begins with process modeling, a business-driven exercise in which current and proposed process flows are documented in detail, linked to quantifiable performance metrics, and optimized through simulation analysis. These optimized models automatically generate the skeleton of the IT implementation in a BPM suite's process designer, a graphical development tool that integrates human workflow, application integration, and business rules to create an executable process solution. Completed process designs are then deployed to the process engine and other components of the BPM suite runtime, where they route and track tasks, integrate with external business systems, and enforce business rules.
As process instances complete each activity, the process engine generates an event to mark the occasion. Those events are collected by the BPM suite's performance management component, which aggregates them into metrics that measure business performance.
Performance management dashboards graph metrics versus their target values, with drill-down analytics via OLAP queries. They also provide real-time alerts and automated escalation procedures when KPIs (key performance indicators) go off track, a capability of the BAM (business activity monitoring) component often bundled with a BPM suite. Actual performance data can be fed back to refine process models and begin a new cycle of incremental process improvement.
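As a rough sketch of that event-to-metric flow (the event fields, process name, and the cycle-time metric below are assumptions made for illustration, not any particular suite's schema), the following Python snippet aggregates activity-completion events emitted by a process engine into an average cycle time per process.

```python
# Minimal sketch: aggregate process-engine completion events into a
# cycle-time metric. All event fields and values here are hypothetical.

from collections import defaultdict

events = [  # each event marks the completion of one activity in one process instance
    {"process": "claim handling", "instance": 1, "activity": "review", "minutes": 12},
    {"process": "claim handling", "instance": 1, "activity": "approve", "minutes": 5},
    {"process": "claim handling", "instance": 2, "activity": "review", "minutes": 20},
    {"process": "claim handling", "instance": 2, "activity": "approve", "minutes": 7},
]

def cycle_times(events):
    """Sum activity durations per process instance, then average them per process."""
    per_instance = defaultdict(float)
    for e in events:
        per_instance[(e["process"], e["instance"])] += e["minutes"]
    per_process = defaultdict(list)
    for (process, _instance), total in per_instance.items():
        per_process[process].append(total)
    return {p: sum(totals) / len(totals) for p, totals in per_process.items()}

print(cycle_times(events))  # {'claim handling': 22.0}, to be compared against the KPI target
```

A dashboard of the kind described above would then chart this computed value against its target and drill down into the underlying events.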
Process Wars
Tally all that functionality, and you end up with quite a stack: software for business modeling, simulation analysis, human workflow, application integration, data mapping, business rules, performance analytics, BAM, and Web portals. All of these originated as independent tools from specialized vendors.
But today, within the BPM world, the trend is toward wrapping all these components inside the BPM suite, whether through mergers and acquisitions, OEM, or integration partnerships. This shift has created conflicts between BPM suite vendors and suppliers of modeling tools, BAM, and integration middleware, each of which
tends to describe BPM in its own way.
Perhaps the greatest source of confusion, however, has derived from the two competing technical architectures for BPM. The one that has gotten the most media attention is based on the BPEL (Business Process Execution Language) standard, which implements processes by orchestrating Web services within an SOA environment. This is where the large infrastructure vendors play, including IBM, Microsoft, Oracle, and SAP.
On the other hand, most pure-play BPM suite vendors, such as Fuego, FileNet, Pegasystems, and Savvion, use an architecture that evolved from the workflow systems of the 1990s, one better suited to incorporating human tasks in the process model. In these offerings, SOA and BPEL play a more limited role and focus on application integration, rather than describing the end-to-end process.
The bottom line is fairly simple. The big-vendor solutions that emphasize BPEL work best for composing Web services into applications that involve little human workflow, that is, without multistage handoffs to various users in various roles in an organization. Pure-plays have long emphasized implementation without programming, so their BPM solutions tend to provide the straightest line to a practical BPM deployment. The downside is that, as opposed to their big-vendor competitors, pure-plays' offerings can be more difficult to integrate into an existing application environment.
Modeling Reality
Whether from a big vendor or a pure-play, modeling tools have one main purpose: to describe business processes in terms of their elemental activities and tasks, the resources required to perform each task, and the business rules interconnecting them, all using a graphical notation understandable to business users.
CASE STUDY
Modeling Employee Background Checks
Sterling Testing Systems never actually called the solution they came up with BPM (business process management) until after the fact, says Paul Mladineo, vice president of strategic development.
But Mladineo and his team, headed up by CTO Michael Richardson, certainly understood the challenges faced by their company, which specializes in pre-employment screening and background checks. They ultimately chose a BPM system from Fuego for the task.
"The data we collect is a commodity, not proprietary," Mladineo says. The task, then, was to differentiate the company's services from the competition, which taps the same information. Because the data is publicly available, Mladineo knew that the data quality, delivery, and services wrapped around the information were how they could achieve this. However, Sterling faced one additional challenge: managing and mapping unique customized services for 4,000 customers was not what Mladineo called "commercially efficient."
What was needed was a BPM solution that could model the sourcing processes and reuse process components where applicable, and then employ logical branching in order to accommodate a wide variety of clients and services. "Employment verification for day care centers is quite different than licensing for driving a tractor trailer," Mladineo says, and yet the process of employment verification itself has some common attributes.
The IT goal was to expose meaningful results to clients with many different formats and kick off alerts, in some cases using XML messaging, and to do it on a scale that could accommodate Sterling's numerous clients.
"With this repository of process components, we can build, configure, and test processes, and it doesn't require hardcoding," Richardson says.
The process started with flowcharting the "as-is" business processes, which gave them the opportunity to automate and reuse pieces of the processes that were similar and then to use branching to accommodate customization.
"It is one thing to do that with a Word document that is sent around, but another thing entirely to translate that to a logical infrastructure running on servers," Richardson says.
The ultimate goal is to cut the amount of time it takes to complete a typical piece of work by 25 to 40 percent. Of course, there is always a disconnect between design and execution.
"Using Fuego, we were able to create the connections between the theoretical process and execution by having a common visualization tool used by both the technical folks and business folks," Richardson says.
Is it a roaring success? Mladineo says it is still early in production, and they don't have a full set of data back, but the mere process of modeling the current system has brought to the surface inefficiencies in the production level. "The rigor required to do it with a tool forced a lot of internal discussion."
— Ephraim Schwartz
What They're Using
In a January 2006 IDC survey, four offerings led the pack when respondents involved in the acquisition of BPM software were asked which solutions were currently in use: IBM WebSphere Process Server, Adobe Workflow or Process Manager, Microsoft BizTalk Server, and BEA WebLogic.
Models play a critical role in aligning process design with quantifiable performance objectives and optimizing expected results through simulation analysis. By annotating each process activity with performance-related parameters such as expected time to perform, resource costs, availability, and branching ratios at forks in the flow path, models can be analyzed in a variety of scenarios using a simulation engine built into the modeling tool. In advanced modeling tools, the KPIs that will be used to measure the performance of the process implementation determine the parameters required in the model, ultimately closing the loop of performance improvement. This requires models that go beyond simple descriptions of activity flow to include modeling of organizational resources, process data, and process performance metrics.
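A simulation engine of the kind described here can be approximated, very roughly, as a Monte Carlo run over the annotated model. The sketch below uses made-up activity durations, a made-up branching ratio, and a made-up resource cost rate to estimate expected cycle time and cost for a simple two-branch process.

```python
# Minimal Monte Carlo sketch of simulation analysis over an annotated
# process model. The durations, cost rate, and branching ratio are
# hypothetical annotations, not data from any real model.

import random

def simulate_once():
    minutes = random.gauss(30, 5)           # "review" activity: expected time to perform
    if random.random() < 0.2:               # branching ratio at the fork: 20% escalate
        minutes += random.gauss(90, 20)     # "escalation" branch
    else:
        minutes += random.gauss(10, 2)      # "standard approval" branch
    return minutes, minutes * 1.5           # assumed resource cost of 1.5 per minute

def simulate(runs=10_000):
    results = [simulate_once() for _ in range(runs)]
    avg_minutes = sum(m for m, _ in results) / runs
    avg_cost = sum(c for _, c in results) / runs
    return avg_minutes, avg_cost

minutes, cost = simulate()
print(f"Expected cycle time: {minutes:.1f} min, expected cost: {cost:.2f}")
```

Changing the branching ratio or an activity's duration and re-running the simulation is the modeling-tool equivalent of asking "what if" before committing to a process change.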
For years, these capabilities have been the exclusive domain of business process modeling tools from vendors such as Casewise, IDS Scheer, Popkin (now Telelogic), and Proforma, often as part of a broader suite of enterprise architecture tools. Now, however, vendors such as Global 360, IBM, and Savvion are implementing the process modeling functions of these tools within the BPM suite itself. At the same time, modeling vendors are improving interoperability among BPM suites by leveraging BPMN (Business Process Modeling Notation), a standardized graphical notation from the Object Management Group.
CASE STUDY
Evaluating Anti-Terror Technology
BPM even has a place in the war on terror, according to Indy Crowley, research staff member and acting lead for IT at the Institute for Defense Analysis (IDA), an organization that evaluates technology under the SAFETY (Support Anti-Terrorism by Fostering Effective Technologies) Act of 2002.
In support of the SAFETY Act, companies submit technologies and services to IDA for evaluation. At any one time, IDA might be juggling 50 different "applications," as they call the products or services. Each application, in turn, may involve as many as 60 different steps before a final evaluation is made and sent on to Homeland Security.
The challenge was to know the status of all applications under evaluation throughout their lifecycle at IDA in order to respond to internal and external inquiries.
"Before using Appian, everything was done on spreadsheets and paper," Crowley says. And it tied up specialists whose sole job was to track the documents throughout the lifecycle.
Crowley says the Appian system formalized a series of tasks or processes. He created a prototype to model the current processes, which was used to discover why applications were so hard to track. Then, as they used Appian, processes were modified.
"We took out steps in the process that were there simply to tell us it was there," Crowley says.
While the system is in its initial stages and there have been improvements, Crowley judged the results "mixed."
One of IDA's goals was to be able to adjust processes frequently as needs arose, and to change or retrograde applications that were already under review. The current system does not do that easily because its business processes are complex, long-running, and hard to modify.
"We are looking for the next version of the product, which will allow us to break the model into small subsections so we can go back and change the processes under way," Crowley says.
The product has allowed IDA to use fewer people to track where things are, and applications are not held up for lack of knowing where they are.
"Now, the process is documented. When questions come up about why it is taking so long, there is a basis to figure out how we can make changes and where," Crowley says.
— E.S.
When asked what business process sets their company was automating with BPM, a majority of qualified respondents in a January 2006 IDC survey identified compliance and customer service. The process sets surveyed were: compliance process sets such as financial, manufacturing, etc.; customer service call to close; daily to annual accounting closings; workforce management from recruitment to retirement; and order to cash.
Where once the output of process modeling was a business-oriented specification intended to guide IT in any implementation efforts that might be needed, BPM assumes an automated process implementation will be executed on the process engine. Modeling standards such as BPMN and interchange formats such as CIF allow the output of a modeling tool to be imported into a BPM suite's design tool and a skeleton implementation design to be generated. This skeleton design lacks the implementation detail to be executable off the bat, but it creates a business-specified starting point.
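The skeleton-generation step can be pictured as a simple translation from modeled activities to unimplemented stubs. The sketch below uses a made-up in-memory model (not BPMN, CIF, or any vendor format) and emits a placeholder handler for each activity, leaving the implementation detail to be filled in later.

```python
# Minimal sketch: turn a business-level flow model into a skeleton of
# stub handlers. The model structure and activities are invented for
# illustration; real suites work from BPMN or a vendor-specific format.

model = [
    {"activity": "receive order", "type": "user task"},
    {"activity": "check credit", "type": "service call"},
    {"activity": "ship order", "type": "integration"},
]

def generate_skeleton(model):
    """Produce one stub function per modeled activity, in flow order."""
    stubs = []
    for step in model:
        name = step["activity"].replace(" ", "_")
        stubs.append(
            f"def {name}(process_data):\n"
            f"    # TODO ({step['type']}): implementation detail to be added by IT\n"
            f"    return process_data\n")
    return "\n".join(stubs)

print(generate_skeleton(model))
```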
The Closed-Loop Problem
Even with standard BPM design languages such as BPEL, each vendor's process design tool is specific to its own runtime environment. Today, there is no such thing as a portable process design that can be executed on your choice of process engines, unless, of course, you consider human tasks, business rules, and complex data mapping to be "external" to the business process design.
Most BPM suites today provide a unified design environment that hides the complexity of combining human workflow, application integration, business rules, and transaction management within a single executable design. The benefits this provides over treating these process components as independent entities in the enterprise architecture stack are a common data model and common state management over the entire end-to-end process.
Like modeling, process design is mostly graphical. The tool provides a palette of activity types from which the designer selects, configures, and assembles the process steps. Unless custom activities need to be created, process design involves minimal programming. Behind the graphical design metaphor, the tool creates an executable process implementation in the BPM suite's particular process execution language.
In BPM suites based on workflow architecture, the language is typically proprietary but compliant with the XPDL (XML Process Definition Language) standard from the Workflow Management Coalition. Process activities may be one of several predefined implementation types (Web service, user task, integration activity), each assigned to a resource, such as a human task role or an integration adapter. The configuration dialog for each activity depends on its type.
By contrast, BPM suites based on service orchestration rely on the BPEL language standard. BPEL provides a single activity type, Invoke, to call a Web service, human task, or integration adapter, all of which must be implemented as a service, with an interface described by WSDL. But Invoke must be addressed to a service end point (a URL, not a task role). To accommodate human tasks, what gets invoked by BPEL is not the user task itself but a task manager service, which handles the workflow details.
Another difference is that workflow-based BPM suites support the notion of a subprocess, a reusable process fragment that shares context data and state management with the calling parent. BPEL provides no such concept. A subprocess is another BPEL process; data sharing and state synchronization must be explicitly defined in the process logic. To address these real-world limitations, IBM and SAP last summer outlined optional extensions to the BPEL standard, but the specs are not yet complete. In the end, regardless of architecture and coding differences, BPM suites tend to accomplish the same set of core functions.
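The architectural contrast described above can be caricatured in a few lines of Python. In the sketch below (an invented mini-runtime, not XPDL, BPEL, or any vendor's engine), a workflow-style activity is addressed to a human task role while an orchestration-style activity is addressed to a service endpoint, and a dispatcher handles each kind differently; the endpoint URL is a placeholder.

```python
# Minimal caricature of the two architectures: activities addressed to a
# human task role versus activities addressed to a service endpoint.
# This is an invented mini-runtime, not XPDL or BPEL.

def dispatch(activity, process_data):
    if activity["kind"] == "user task":
        # workflow style: route the work item to a role's queue
        print(f"Queued '{activity['name']}' for role '{activity['role']}'")
    elif activity["kind"] == "invoke":
        # orchestration style: call a service endpoint described by an interface
        print(f"Invoking {activity['endpoint']} for '{activity['name']}'")
    return process_data

process = [
    {"name": "approve claim", "kind": "user task", "role": "claims adjuster"},
    {"name": "update policy system", "kind": "invoke",
     "endpoint": "https://example.internal/policy-service"},  # placeholder endpoint
]

state = {"claim_id": 42}
for activity in process:
    state = dispatch(activity, state)
```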
Rolling Out Process-Driven Apps
The completed process design is then deployed to the process engine. As each instance of the process is triggered, the engine routes it through the defined sequence of activities, integrating external applications, routing workflow tasks to human participants, and managing deadlines and exceptions throughout the process. In offerings from app server vendors such as IBM, Microsoft, Oracle, or SAP, the process engine leverages unique capabilities of the app server and its associated middleware. Offerings from BPM pure-plays tend to run on the user's choice of app server platforms.
The process engine also reports snapshots of instance data and state, usually
in the form of events, for the purposes of performance management. The performance management component of the BPM suite collects those events and uses them to update KPIs and other performance metrics defined in the modeling phase. Typically, metrics are aggregated in OLAP cubes, which can be charted and queried by users in management dashboards. OLAP-based performance management provides historical and "near-real-time" reporting and drill-down analytics, as updates are performed on demand by re-crunching the collected data set. Some BPM suites, including those offered by Adobe, FileNet, IBM, Intalio, and Savvion, support true BAM, providing real-time update of selected KPIs with rule-triggered alerts and escalation actions.
Metrics computed from running processes can be used to refine the model parameters used to generate expected values of those metrics, making the effect of process changes more predictable and stimulating additional rounds of process improvement.
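A BAM-style rule of the sort mentioned above boils down to comparing each incoming metric update against its target and firing an escalation when the threshold is crossed. The sketch below uses an invented KPI name, target, and stream of values purely for illustration.

```python
# Minimal sketch of a BAM-style rule: watch a KPI as metric updates
# arrive and escalate when it goes off track. The KPI name, target,
# and incoming values are invented for illustration.

KPI_TARGET_MINUTES = 30          # assumed target for average handling time

def escalate(kpi_name, value, target):
    # a real suite would raise a dashboard alert or notify the process owner
    print(f"ALERT: {kpi_name} at {value} exceeds target {target}")

def on_metric_update(kpi_name, value, target=KPI_TARGET_MINUTES):
    if value > target:
        escalate(kpi_name, value, target)

for update in [24, 28, 35, 41]:  # stream of near-real-time metric values
    on_metric_update("average handling time (min)", update)
```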
The BPM Choice
Picking the right BPM suite is a real challenge. Although virtually all the vendors promote the same list of capabilities in their brochures and Web sites, each offering is actually tuned to the requirements of a fairly narrow set of process types or use cases.
For example, a BPM suite designed for transactional, "straight through" processes involving complex application integration but little human interaction might not be the best choice for collaborative, human-centric processes with minimal integration. Document-centric processes and production workflow processes, where pools of users draw tasks from shared queues at high speed, have their own unique requirements, addressed by some, but not all, BPM suites.
Such complications aside, BPM is offering real return on investment to users today. When presented as an architectural stack, it can sound like a tangled mess. But the new generation of integrated BPM suites is untangling BPM and providing a new middle ground for business-IT collaboration.
Bruce Silver is an independent analyst and author of The 2006 BPMS Report, a free download from the BPM Institute (infoworld.com/…).
CASE STUDY
Building a Workflow for Insurance Reps
With more than $3 billion in sales and more than 2.3 million customers, 100-year-old American National Insurance Company (ANIC) has quite a few legacy systems in operation.
The greatest challenge for ANIC was around customer service. "Customers wanted information about the relationship between us and them," says Gary Kirkham, vice president and director of planning and support at ANIC.
Although CSRs (customer service reps) went to great lengths to satisfy a query, it often would mean a handling time of 10 or 11 minutes, as reps drilled down into one system for a piece of the answer, logged out, and went into the next system.
"The reps had to have enough talk that was meaningful while they worked between systems," Kirkham says.
ANIC wanted one interface with all the data immediately accessible to the CSRs. When Kirkham started the project in the '90s, the process was called "technology-enabled selling," he recalls, and it was only after Gartner redefined the market in early 2000 as BPM that the name came to mean something more.
The original goal was to put the CSR in a position to help the customer as quickly as possible. As that improves, ANIC has a secondary goal of optimizing and automating processes.
When an insured customer dies, for example, it triggers an entire set of processes that used to be done manually. "Once Pegasystems knows an insured has expired, it processes automatically all the things three people used to do," Kirkham says.
The results for ANIC have been quite dramatic. Kirkham credits the new system with increasing sales in its annuity division from $750 million to $2.2 billion two years in a row. It does this by helping ANIC's 80 independent brokers differentiate among callers by which ones produce the most sales.
After the processes have been modeled, Pegasystems is used to log on to all of ANIC's legacy systems when a request comes from a workstation. A transaction is launched and comes back to a clipboard behind the scenes of the workflow. When a customer call comes in, Pegasystems taps into the various customer files and history and sends the information up to the CSR. "Depending on what the customer asks for, different business rules are enacted that take the CSR down different paths," Kirkham says.
"Through our business rule, we can service one of these customers usually within 15 seconds," he says.
— E.S.
