ArchiMate® 3 Part 2 Exam Questions and Answers
Please read this scenario prior to answering the question
ArchiSurance has decided to leverage its financial expertise by offering defined contribution retirement plans. Each trading day, ArchiSurance submits consolidated mutual fund trading transactions to a stock exchange on behalf of its retirement
plan participants.
The daily mutual fund trading cycle consists of four key processes: transaction capture, pricing, trading, and reconciliation. Transaction capture consists of two sub-processes: manual exchange and loans and distributions (L&D). For transaction capture, retirement plan participants use an online account management application to enter manual fund exchange transactions. For L&D, plan participants use a separate application to enter requests. The L&D application determines whether a request can be fulfilled based on the mutual fund balances held in each plan and a set of business rules. Each day's captured manual exchange transactions accumulate in a transaction database.
ArchiSurance contracts with a third-party information service to receive a file of mutual fund prices at the close of each trading day. The pricing application uses this file to convert captured transactions into trades, and then validates each trade against the mutual fund balances held in each plan. The pricing application generates a trade file with the minimum number of trades necessary. The trading application sends this file to an external trading service. When the trading application receives a confirmation file back from the trading service, it triggers the reconciliation application to update the plan recordkeeping database.
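Read as an inventory, the cycle above names a set of processes, applications, data objects, and external services. The following Python sketch is only one illustrative reading of the scenario, not an ArchiMate model and not one of the answer options; the data-object labels are paraphrased from the text.

    # Illustrative inventory of the daily fund trading cycle (paraphrased labels).
    processes = {
        "Transaction Capture": ["Manual Exchange", "Loans and Distributions (L&D)"],
        "Pricing": [],
        "Trading": [],
        "Reconciliation": [],
    }

    # Application serving each process or sub-process.
    application_for = {
        "Manual Exchange": "Account Management Application",
        "Loans and Distributions (L&D)": "L&D Application",
        "Pricing": "Pricing Application",
        "Trading": "Trading Application",
        "Reconciliation": "Reconciliation Application",
    }

    # Data objects read or written by each application.
    data_access = {
        "Account Management Application": ["Transaction Database"],
        "L&D Application": ["Plan Balances", "Business Rules"],
        "Pricing Application": ["Price File", "Transaction Database",
                                "Plan Balances", "Trade File"],
        "Trading Application": ["Trade File", "Confirmation File"],
        "Reconciliation Application": ["Plan Recordkeeping Database"],
    }

    # External application services that access some of those data objects.
    external_services = {
        "Information Service": ["Price File"],
        "Trading Service": ["Trade File", "Confirmation File"],
    }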
The account management and L&D applications are hosted on separate application server clusters. Each cluster is a set of virtualized hosts running application server software on its own physically separate host. Both applications use a database server infrastructure hosted on another cluster of virtualized servers, also on a dedicated physical host. The pricing, consolidation, trading, and reconciliation applications, however, are batch applications that run on the ArchiSurance mainframe computer. All application hosts are connected via a converged data center network (DCN), which also connects them to a storage area network (SAN) and to a wide area network (WAN) used to communicate with the external trading service. The SAN includes two physically separate storage arrays: one holds data for all databases, and the other holds data for all files.
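The infrastructure described above can be summarized as a hosting map plus a connectivity map. The sketch below is illustrative only; node labels are paraphrased from the scenario.

    # Hosting: which technology node runs which application(s).
    hosting = {
        "Account Management Cluster": ["Account Management Application"],
        "L&D Cluster": ["L&D Application"],
        "Database Cluster": ["Database Server Infrastructure"],
        "Mainframe": ["Pricing Application", "Consolidation Application",
                      "Trading Application", "Reconciliation Application"],
    }

    # Connectivity: what each network links together.
    connectivity = {
        "DCN": ["Account Management Cluster", "L&D Cluster", "Database Cluster",
                "Mainframe", "SAN", "WAN"],
        "WAN": ["External Trading Service"],
        "SAN": ["Database Storage Array", "File Storage Array"],
    }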
Refer to the scenario
The systems analysts would like to better understand the business processes and applications for daily fund trading. You have been asked to describe the business processes and sub-processes, the applications that they use, the data objects
accessed by those applications, and the external application services that access some of those data objects.
Which of the following is the best answer? Note that you are not required to model the business actors/roles.
Please read this scenario prior to answering the question
The IT Operations (IT Ops) department at ArchiSurance has five core responsibilities, each carried out by a dedicated business process: (1) Batch Operations (Batch Ops), (2) Online Operations (Online Ops), (3) Security Operations (Security Ops), (4) User Support, and (5) Continuous Improvement. Service level agreements (SLAs) are in place for Batch Ops and Online Ops, and each Ops process generates monitoring data that is used by the Continuous Improvement process.
The System Ops category consists of Batch Ops, Online Ops, and Security Ops, each of which has an incident management sub-process triggered by Batch, Online, and Security Incidents, respectively. At the start of each incident management sub-process, an Incident Alert is shared with the other System Ops processes by posting it to the Alert Buffer.
Batch Ops relies on a schedule that outlines all batch jobs and their dependencies. The schedule serves two sub-processes: Batch Planning, which updates it, and Execution Management, which uses it.
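As a quick aid, the process structure described so far can be captured as a plain inventory. The Python sketch below is illustrative only and is not an ArchiMate model.

    # IT Ops processes and the System Ops grouping.
    it_ops = ["Batch Ops", "Online Ops", "Security Ops", "User Support",
              "Continuous Improvement"]
    system_ops = ["Batch Ops", "Online Ops", "Security Ops"]

    # Each System Ops process has an incident management sub-process triggered
    # by its own incident type; Incident Alerts are shared via the Alert Buffer.
    incident_triggers = {
        "Batch Ops": "Batch Incident",
        "Online Ops": "Online Incident",
        "Security Ops": "Security Incident",
    }

    # Batch Ops sub-processes and their relationship to the batch schedule.
    batch_ops_subprocesses = {
        "Batch Planning": "updates the batch schedule",
        "Execution Management": "uses the batch schedule",
    }

    slas_cover = ["Batch Ops", "Online Ops"]
    monitoring_data_flows_to = "Continuous Improvement"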
The Batch Ops process relies on a suite of interconnected applications. Among these, the Batch Scheduler lets users manage a database of jobs, job schedules, and dependencies, and launches batch jobs according to the information stored in that database.
Working in conjunction with the Batch Scheduler, the Batch Monitor application uses the job schedules to monitor job execution and identify any exceptional conditions that arise. It communicates these exceptions to both the Batch Scheduler and the Incident Handler applications through the previously mentioned Alert Buffer.
The Incident Handler application uses a defined set of business rules to determine which systems and individuals must be notified for each incident, and then generates the appropriate notifications.
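The cooperation between these applications, as described above, can be sketched as a simple read/write view. The structure below is illustrative only; the function descriptions and data labels are paraphrased from the scenario.

    # Illustrative view of the Batch Ops applications and the data they touch.
    applications = {
        "Batch Scheduler": {
            "function": "manages the job database and launches batch jobs",
            "reads": ["Job Database", "Alert Buffer"],   # also consumes exception alerts
            "writes": ["Job Database"],
        },
        "Batch Monitor": {
            "function": "monitors job execution against the job schedules",
            "reads": ["Job Schedules"],
            "writes": ["Alert Buffer"],                  # posts exception alerts
        },
        "Incident Handler": {
            "function": "decides which systems and people to notify per incident",
            "reads": ["Alert Buffer", "Business Rules"],
            "writes": ["Notifications"],
        },
    }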
Because the Batch Scheduler, Batch Monitor, and Incident Handler applications are critical, ArchiSurance hosts them redundantly across multiple geographically distributed data centers. In each data center, the three applications run on fully redundant virtual server clusters. Each cluster is connected to two site local area networks, both of which are further linked to separate storage array hardware devices.
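For a single data center, the hosting described above might be summarized as follows. The two-cluster and two-array counts are assumptions for illustration; the scenario only says the clusters are fully redundant and that the two site LANs are linked to separate storage arrays.

    # Illustrative sketch of one data center (cluster and array counts assumed).
    data_center = {
        "clusters": {
            "Cluster A": ["Batch Scheduler", "Batch Monitor", "Incident Handler"],
            "Cluster B": ["Batch Scheduler", "Batch Monitor", "Incident Handler"],
        },
        "site_lans": {
            "Site LAN 1": ["Cluster A", "Cluster B", "Storage Array 1"],
            "Site LAN 2": ["Cluster A", "Cluster B", "Storage Array 2"],
        },
    }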
Refer to the scenario
As part of an IT service management initiative, you have been asked to show how applications and technology support the Batch Ops process. The model should show the relationships between the applications, their functions, the data they access, and the technology that hosts the applications and data, along with the networks that connect the servers. It is only necessary to model a single data center.
Which of the following answers provides the most complete and accurate model?