
The purpose of the assignment: Produce a proposal for a Chief Product Officer (CPO) that outlines the main system requirements specification and the design and implementation of an IS solution in a project.

The assignment is divided into 3 parts.

Part 1:

  • A brief literature review detailing the ICT requirements for a Project Management Information System, and selection of an organisation to explore this
  • Conduct a brief analysis of your chosen organisation's business
  • Based on your analysis, propose a web-based project management tool or a portal design for managing projects in your chosen organisation
  • Provide justification for your chosen design

Word limit: 1000-1200 words (excluding references and appendix)

Part 2:

  • Generate a Requirements Specification Report (see RSR Template) for the proposed IS system, which must include
    • suitable workflows
    • performance metrics
    • functional and non-functional requirements of the system, such as availability, security, and infrastructure
  • Must contain a business continuity plan, suitable performance metrics, and KPIs
  • Discuss the acquisition process of the proposed IS system using concepts such as the SDLC

Word limit: 1500-1800 words (excluding references and appendix)

Part 3:

  • Create and execute a test plan for the proposed system
  • Recommendations on how to effectively utilise the proposed IS system and plans for business continuity

Word limit: 1500-1800 words (excluding references and appendix)

Brief demystification of the test plan

    A test plan is a document that describes the scope, resources, activities, functionality to be delivered, and schedule of the testing activities (Munassar & Goverdhan, 2010). It also establishes the test items, the testing tasks and the persons responsible for each, the features to be tested, and the likely challenges and their solutions (Lewis, 2016). These elements are therefore indispensable in any system test plan.

The test plan will be structured as follows:

  • Testing goals
  • Testing process
  • Roles and Responsibilities
  • Environment

Testing goals

  • Purpose

The scope of the test will cover both the functional and non-functional aspects of the complete system, with greater emphasis on the non-functional aspects. The non-functional requirements are those needed for the system to function correctly, including security, backup, recovery, and concurrency.

There are several chief expectations of this system test. The most important are ensuring the satisfactory operation of the system and providing management with the decision-making support required for planning, controlling, and monitoring the project (Myers, Sandler, & Badgett, 2011).

  • Testing deliverables

The data reported will cover the security, recovery, backup, and concurrency status of the system, since these are the primary aspects under test. After thorough and comprehensive testing, the testing team leader will deliver the conclusive results in a printed report and present it to top management.

Testing process

  • Tests to be run

The chief types of tests that will be run under this plan are security, backup, recovery, and concurrency testing. Each will be handled comprehensively according to its features (Agarwal, Tayal, & Gupta, 2010).

Security testing

Security testing is chiefly meant to protect the database from unauthorized internal or external access (Hsu & Huang, 2011). It establishes whether the information system protects data while preserving the intended functionality. This testing will validate six fundamental principles: integrity, confidentiality, authorization, authentication, non-repudiation, and availability.

Backup testing

Backup testing will verify the capacity to back up the system. It involves replicating the system's data and files and storing them for future retrieval in case data is lost, whether intentionally or unintentionally (Hsu & Huang, 2011). This will be a chief element of the plan because backup is central to the system.

Recovery testing

Recovery testing will verify the restoration of the system's database to a condition known to be accurate following a failure that rendered it unreliable (Buse & Zimmermann, 2012). It will evaluate the system's ability to reliably and efficiently retrieve data or files that may have been lost during operation.

Concurrency testing

This plan will seek to ensure that concurrent processes, such as updates and queries, do not obstruct each other. It will also seek to detect defects that surface when multiple users are logged in, ensuring the system can run simultaneous processes without breaking down.

  • Testing time schedule

The testing will require substantial time to ensure it is comprehensive and accurate (McGregor, 2010). Since the system serves a large organization, the most suitable schedule is approximately 18 hours. This time frame will cover all the indispensable testing aspects of the system.

  • Assumptions

Several central assumptions have been made for this testing. Following Davis and Venkatesh (2004), they include the following:

  • The indispensable testing tools and access to the tools are possible and will be offered to the test team
  • There will be a workable test environment available for the test team to carry out the testing
  • The provided test environment will be similar to the production environment in both the software and hardware configurations
  • The resources for the test team, including the roles and responsibilities, are accessible and dedicated to the test team

  • Prerequisites

Two essential tests must be conducted prior to this test: system (or integration) testing and performance testing. System testing ensures that all the components of the system are functional and work harmoniously together (Raymond & Bergeron, 2008), while performance testing ensures that the hardware and software components of the system perform well.

  • Criteria for success

For this testing to be fruitful, it must meet a particular threshold: all four types of testing must perform above par, and the system must be able to run for over a year without facing problems.

  • What to do when the test is successful

The system will be continuously monitored, adjusted, and periodically improved.

  • What to do when the test fails

The system will be subjected to further development in the testing aspects that failed. This will mainly involve the system developers.

  • Roles and responsibilities

The hierarchy in the system testing team will involve two levels, namely:

  • Test manager
  • Test engineers

The roles and responsibilities of the test manager are (Schwalbe, 2015):

  1. Defining the testing tasks and activities for the subordinates (the test engineers, or testers)
  2. Checking that the team has all the indispensable resources to carry out the testing activities
  3. Checking that the testing corresponds satisfactorily to the software development across all phases
  4. Preparing the status report on testing activities
  5. Periodically updating the project manager on the progress of the testing activities

The roles and responsibilities of the test engineers are (Schwalbe, 2015):

  1. Reading all the documents and comprehending what needs to be tested
  2. Deciding how the testing will be done based on the information in the above step
  3. Informing the test lead of the resources needed for the system testing
  4. Developing the necessary test cases and prioritizing test activities
  5. Carrying out all the test cases and reporting any defects

  • Environment

Setting up the appropriate test environment is highly necessary to ensure the success of the system testing.  Any flaws at this level can be expensive in terms of additional costs and time (Peppard & Ward, 2016).

The needed hardware equipment

The hardware required for this system testing includes computers, scanners, handheld devices, special printers, and network cables, together with relevant documentation such as installation guides, configuration manuals, reference documents, and user manuals.

The required software 

The specific applications needed include Excel, Word, an operating system, a web browser, SPSS, and R, together with network access.

Environmental and system security elements

The test environment needs to be secure to ensure that everything in the testing plan goes smoothly, with minimal security challenges (Fenton & Bieman, 2014). Before setting up the test environment, we will ensure that the test team has thoroughly understood the test requirements for the system. The main security elements will include:

  1. Checking for the required hardware and software licenses
  2. Ascertaining the software versions
  3. Verifying the hardware and software configurations
  4. Browser and firewall settings
  5. Passwords


   The present functionality of the IS system in the organization is commendable. However, some measures need to be put in place to improve the productivity and effectiveness of the system. The following measures are suggested:

  • The organization should facilitate the system admin in ensuring that the system runs with minimal interruptions.
  • The system should incorporate all the other relevant departments of the organization.
  • Regular updates should be done on the system.
  • The users of the system in the organization should be adequately trained and inducted into its operations.
  • There should be optimal use of the system's abilities and features.


  • Agarwal, B., Tayal, P., & Gupta, M. (2010). Software engineering and testing. Jones & Bartlett Learning.
  • Buse, P., & Zimmermann, T. (2012). Information needs for software development analytics. In Proceedings of the 34th international conference on software engineering. IEEE Press, 987-996.
  • Davis, D., & Venkatesh, V. (2004). Toward pre-prototype user acceptance testing of new information systems: implications for software project management. IEEE Transactions on Engineering Management, 31-46.
  • Fenton, N., & Bieman, J. (2014). Software metrics: a rigorous and practical approach. CRC Press.
  • Hsu, J., & Huang, Y. (2011). An adaptive reliability analysis using path testing for complex component-based software systems. IEEE Transactions on Reliability, 158-170.
  • Lewis, E. (2016). Software testing and continuous quality improvement. CRC Press.
  • McGregor, D. (2010). Testing a software product line. In Testing Techniques in Software Engineering. Springer, 104-140.
  • Munassar, A., & Goverdhan, A. (2010). A comparison between five models of software modeling. IJCSI International Journal of Computer Science Issues, 94-101.
  • Myers, J., Sandler, C., & Badgett, T. (2011). The art of software testing. John Wiley & Sons.
  • Peppard, J., & Ward, J. (2016). The strategic management of information systems: Building a digital strategy. John Wiley & Sons.
  • Raymond, L., & Bergeron, F. (2008). Project management information systems: An empirical study of their impact on project managers and project success. International Journal of Project Management, 213-220.
  • Schwalbe, K. (2015). Information technology project management. Cengage Publishing.