• 'Software Testing Foundations' Notes

    randomly copied from: Software Testing Foundations, 4th Edition

    The Fundamental Test Process

    Test Planning and Control

    Planning of the test process starts at the beginning of the software development project. The mission and objectives of testing must be defined and agreed upon, as well as the resources necessary for the test process.

    The main task of planning is to determine the test strategy or approach. Since an exhaustive test is not possible, priorities must be set based on risk assessment. The test activities must be distributed across the individual subsystems, depending on the expected risk and the severity of failure effects.

    Test Analysis and Design

    The first task is to review the test basis, i.e., the specification of what should be tested. The specification should be concrete and clear enough to develop test cases. The basis for the creation of a test can be the specification or architecture documents.

    It is important to ensure traceability between the specifications to be tested and the tests themselves. It must be clear which test cases test which requirement and vice versa. Only this way is it possible to decide which requirements are to be or have been tested, how intensively, and with which test cases. Even the traceability of requirement changes to the test cases and vice versa should be verified.
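    A minimal sketch of such a traceability mapping (the requirement and test case IDs below are invented for illustration):

```javascript
// Hypothetical requirement-to-test-case mapping; IDs are made up.
const traceability = {
  'REQ-1': ['TC-1', 'TC-2'],
  'REQ-2': ['TC-1'],
  'REQ-3': [], // no test case yet -> not covered
};

// Which requirements have no test case at all?
function untested(matrix) {
  return Object.keys(matrix).filter((req) => matrix[req].length === 0);
}

// Reverse direction: which requirements does a given test case cover?
function coveredBy(matrix, testCase) {
  return Object.keys(matrix).filter((req) => matrix[req].includes(testCase));
}

console.log(untested(traceability));          // → ['REQ-3']
console.log(coveredBy(traceability, 'TC-1')); // → ['REQ-1', 'REQ-2']
```

    Both lookup directions are cheap once the mapping exists, which is the point of keeping it current as requirements change.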

    Test Implementation and Execution

    Tests must be run and logged, following the priority of the test cases decided during planning.

    Test Evaluation and Reporting

    During test evaluation and reporting, the test object is assessed against the test exit criteria specified during planning. This may result in normal termination of the tests if all criteria are met, or it may be decided that additional test cases should be run or that the criteria were too strict. It must be decided whether the test exit criteria defined in the test plan are fulfilled.

    Test Closure Activities

    The following data should be recorded:

    • when was the software system released?
    • when was the test finished or terminated?
    • when was a milestone reached or a maintenance release completed?

    Important information for evaluation can be extracted by asking the following questions:
    • which planned results were achieved and when – if at all?
    • which unexpected events happened (reasons and how they were met)?
    • are there any open problems or change requests? why were they not implemented?
    • how was user acceptance after deploying the system?

    General Principles of Testing

    • Testing shows the presence of defects, not their absence. Testing can show that the product fails, but it cannot prove that a program is defect free. Even if no failures are found during testing, this is no proof that there are no defects.
    • Exhaustive testing is impossible. It’s impossible to run a test that includes all possible input values.
    • Testing activities should start as early as possible.
    • Defect clustering: if many defects are detected in one place, there are normally more defects nearby.
    • The pesticide paradox: new and modified test cases should be developed and added to the test suite over time.
    • Testing is context dependent.
    • “No failures means the system is useful” is a fallacy.

    Test Plan

    The test manager might participate in the following planning activities:

    • defining the overall approach to and strategy for testing
    • deciding about the test environment and test automation
    • defining the test levels and their interaction, and integrating the testing activities with other project activities
    • deciding how to evaluate the test results
    • selecting metrics for monitoring and controlling test work, as well as defining test exit criteria
    • determining how much test documentation shall be prepared and determining templates
    • writing the test plan and deciding on what, who, when, and how much testing
    • estimating test effort and test cost.

    Test Entry and Exit Criteria

    Typical entry criteria:

    • the test environment is ready
    • the test tools are ready for use in the test environment
    • test objects are installed in the test environment
    • the necessary test data is available


    Typical exit criteria:

    • achieved test coverage: tests run, covered requirements, code coverage, etc.
    • product quality: defect density, defect severity, failure rate and reliability of the test object
    • residual risk: tests not executed, defects not repaired, incomplete coverage of requirements or code, etc.
    • economic constraints: allowed cost, project risks, release deadlines, and market chances.
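    Exit criteria like these can be turned into an automatic check. A hypothetical sketch (all field names and thresholds below are invented for illustration, not from the book):

```javascript
// Hypothetical exit-criteria gate; fields and thresholds are made up.
function exitCriteriaMet(m) {
  return (
    m.requirementCoverage >= 0.95 && // achieved test coverage
    m.openCriticalDefects === 0 &&   // product quality
    m.residualRiskAccepted &&        // residual risk signed off
    m.costSoFar <= m.allowedCost     // economic constraints
  );
}

console.log(exitCriteriaMet({
  requirementCoverage: 0.97,
  openCriticalDefects: 0,
  residualRiskAccepted: true,
  costSoFar: 80,
  allowedCost: 100,
})); // → true
```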

    Test Plan according to IEEE 829-1998

    • Test Plan Identifier
    • Introduction
    • Test Items
    • Features to be Tested
    • Features not to be Tested
    • Approach
    • Item Pass/Fail Criteria (exit criteria)
    • Suspension Criteria and Resumption Requirements
    • Test Deliverables
    • Testing Tasks
    • Environmental Needs
    • Staffing and Training Needs
    • Schedule
    • Risks and Contingencies
    • Approvals
  • npm versioning



    [major, minor, patch]


    • first release: 1.0.0
    • bug fix, minor change -> 1.0.1, 1.0.2
    • non-breaking new features -> 1.1.0
    • breaking changes -> 2.0.0
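    The bump rules above can be sketched with a tiny helper (this is not the real semver API; the real library exposes `semver.inc(version, 'patch' | 'minor' | 'major')`):

```javascript
// Hypothetical bump helper illustrating [major, minor, patch] semantics.
function bump([major, minor, patch], change) {
  if (change === 'breaking') return [major + 1, 0, 0]; // breaking change
  if (change === 'feature') return [major, minor + 1, 0]; // non-breaking feature
  return [major, minor, patch + 1]; // bug fix / minor change
}

console.log(bump([1, 0, 0], 'fix').join('.'));      // → 1.0.1
console.log(bump([1, 0, 2], 'feature').join('.'));  // → 1.1.0
console.log(bump([1, 1, 0], 'breaking').join('.')); // → 2.0.0
```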

    Semver for Consumers

    update dependencies based on semantic versions

    npm install:

    npm will look at the dependencies that are listed in that file and download the latest versions, using semantic versioning. (https://docs.npmjs.com/getting-started/using-a-package.json#managing-dependency-versions)

    yarn upgrade:

    Upgrades packages to their latest version based on the specified range. (https://yarnpkg.com/lang/en/docs/cli/upgrade/)

    prerelease tags and ranges

    based on the following check:

    const semver = require('semver')
    > semver.gt('1.1.1-rc.1', '1.1.0')      // true
    > semver.gt('1.1.2-rc.1', '1.1.1-rc.1') // true
    > semver.gt('1.1.2-rc.2', '1.1.2-rc.1') // true
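    These comparisons follow SemVer precedence. A simplified sketch of the ordering (numeric [major, minor, patch] plus dot-separated prerelease identifiers; build metadata and hyphenated prerelease names are ignored here):

```javascript
// Simplified SemVer precedence, enough for x.y.z and x.y.z-rc.N style versions.
function parse(v) {
  const [core, pre] = v.split('-');
  return { nums: core.split('.').map(Number), pre: pre ? pre.split('.') : null };
}

function cmpIds(a, b) {
  const an = /^\d+$/.test(a), bn = /^\d+$/.test(b);
  if (an && bn) return Number(a) - Number(b); // numeric identifiers compare numerically
  if (an !== bn) return an ? -1 : 1;          // numeric < alphanumeric
  return a < b ? -1 : a > b ? 1 : 0;          // alphanumeric compare lexically
}

function gt(a, b) {
  const pa = parse(a), pb = parse(b);
  for (let i = 0; i < 3; i++) {
    if (pa.nums[i] !== pb.nums[i]) return pa.nums[i] > pb.nums[i];
  }
  if (!pa.pre && pb.pre) return true;   // a release outranks its own prereleases
  if (pa.pre && !pb.pre) return false;
  if (!pa.pre && !pb.pre) return false; // equal versions
  for (let i = 0; i < Math.min(pa.pre.length, pb.pre.length); i++) {
    const c = cmpIds(pa.pre[i], pb.pre[i]);
    if (c !== 0) return c > 0;
  }
  return pa.pre.length > pb.pre.length; // longer prerelease wins when prefix-equal
}

console.log(gt('1.1.1-rc.1', '1.1.0'));      // → true
console.log(gt('1.1.2-rc.1', '1.1.1-rc.1')); // → true
console.log(gt('1.1.2-rc.2', '1.1.2-rc.1')); // → true
```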

    I would assume that given a list of versions: ['1.1.0', '1.1.1-rc.3', '1.1.1-rc.2', '1.1.1-rc.1', '1.1.2-rc.1', '1.1.2-rc.2']:

    • >1.0 would return 1.1.2-rc.2
    • >1.1.1-rc.2 would return 1.1.2-rc.2

    The actual result is:

    > semver.maxSatisfying(['1.1.0', '1.1.1-rc.3', '1.1.1-rc.2', '1.1.1-rc.1', '1.1.2-rc.1', '1.1.2-rc.2'], '>1.0')
    // '1.1.0'
    > semver.maxSatisfying(['1.1.0', '1.1.1-rc.3', '1.1.1-rc.2', '1.1.1-rc.1', '1.1.2-rc.1', '1.1.2-rc.2'], '>1.1.1-rc.2')
    // '1.1.1-rc.3'
    // add 1.2.0 to the version list
    > semver.maxSatisfying(['1.1.0', '1.1.1-rc.3', '1.1.1-rc.2', '1.1.1-rc.1', '1.1.2-rc.1', '1.1.2-rc.2', '1.2.0'], '>1.1.1-rc.2')
    // '1.2.0'

    Is it possible to get the latest pre-release based on a version range? The answer is no.


    If a version has a prerelease tag (for example, 1.2.3-alpha.3) then it will only be allowed to satisfy comparator sets if at least one comparator with the same [major, minor, patch] tuple also has a prerelease tag.

    For example, the range >1.2.3-alpha.3 would be allowed to match the version 1.2.3-alpha.7, but it would not be satisfied by 3.4.5-alpha.9, even though 3.4.5-alpha.9 is technically “greater than” 1.2.3-alpha.3 according to the SemVer sort rules. The version range only accepts prerelease tags on the 1.2.3 version. The version 3.4.5 would satisfy the range, because it does not have a prerelease flag, and 3.4.5 is greater than 1.2.3-alpha.7.

    The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author’s design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics.

    Second, a user who has opted into using a prerelease version has clearly indicated the intent to use that specific set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the next set of prerelease versions.
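    The rule can be sketched in plain JavaScript, reproducing the maxSatisfying results above. This is a simplified sketch, not the real library: it only handles single '>base' ranges and rc.N-style prerelease tags; `semver.maxSatisfying` implements the full range grammar.

```javascript
// Map 'x.y.z' / 'x.y.z-rc.n' to a sortable tuple; a release sorts after its
// own prereleases (simplified: assumes an rc.N prerelease scheme only).
function key(v) {
  const [core, pre] = v.split('-');
  const [ma, mi, pa] = core.split('.').map(Number);
  const rc = pre ? Number(pre.split('.')[1]) : Infinity;
  return [ma, mi, pa, rc];
}
function gt(a, b) {
  const ka = key(a), kb = key(b);
  for (let i = 0; i < 4; i++) if (ka[i] !== kb[i]) return ka[i] > kb[i];
  return false;
}
function core(v) { return v.split('-')[0]; }

// maxSatisfying for a simple '>base' range under the prerelease rule:
// a prerelease candidate is only eligible when the range itself has a
// prerelease tag on the same [major, minor, patch] tuple.
function maxSatisfying(versions, base) {
  const basePre = base.includes('-');
  const ok = versions.filter((v) => {
    if (!gt(v, base)) return false;
    if (!v.includes('-')) return true;        // releases are always eligible
    return basePre && core(v) === core(base); // prereleases: same tuple only
  });
  return ok.sort((a, b) => (gt(a, b) ? 1 : -1)).pop() || null;
}

const list = ['1.1.0', '1.1.1-rc.3', '1.1.1-rc.2', '1.1.1-rc.1', '1.1.2-rc.1', '1.1.2-rc.2'];
console.log(maxSatisfying(list, '1.0.0'));      // → '1.1.0'  (like '>1.0')
console.log(maxSatisfying(list, '1.1.1-rc.2')); // → '1.1.1-rc.3'
console.log(maxSatisfying([...list, '1.2.0'], '1.1.1-rc.2')); // → '1.2.0'
```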

    semver.js source code: https://github.com/npm/node-semver/blob/v5.5.0/semver.js#L1121

    How to get the latest pre-release versions?

    dist-tag: https://docs.npmjs.com/cli/dist-tag

    Publishing a package sets the latest tag to the published version unless the --tag option is used. For example, npm publish --tag=beta.
    By default, npm install <pkg> (without any @<version> or @<tag> specifier) installs the latest tag.

    So, to release a new pre-release version:

    npm publish --tag next

    To use the latest pre-release version:

    npm install <package>@next

    to verify:

    npm dist-tag ls -a <package>
  • Ember.js notes

    install ember

    npm install -g ember-cli

    new project

    ember new devops-dashboard
    cd devops-dashboard
    ember serve 

    new route

    ember generate route projects
    # update index template application.hbs
    <h1>DevOps Dashboard</h1>
    # update project.hbs:
    # update projects.js to provide modeling:
    import Route from '@ember/routing/route';

    export default Route.extend({
        model() {
            return ['java-project-A', 'python-project-B', 'emberjs-project-C'];
        }
    });

    Now we get a /projects page containing a list of projects.

    setup default route

    Now I want to set /projects as the index:

    # vi routes/index.js
    import Route from '@ember/routing/route';
    // https://guides.emberjs.com/release/routing/redirection/#toc_transitioning-before-the-model-is-known
    export default Route.extend({
      beforeModel(/* transition */) {
        this.transitionTo('projects'); // Implicitly aborts the on-going transition.
      }
    });

    fetch data from http as modeling

    # use github/python projects as modeling
    # projects.js
    import Route from '@ember/routing/route';
    export default Route.extend({
        model() {
            // return ['java-project-A', 'python-project-B', 'emberjs-project-C'];
            return $.getJSON("https://api.github.com/orgs/python/repos");
        }
    });
    # show project names in projects.hbs

    Now our app lists all projects belonging to github/python.

    create sub-route for project details

    ember generate route projects/view
    # add link from projects to projects/view
    # projects.hbs
            <li>  </li>
    # router.js to accept a parameter from the link
    Router.map(function() {
      this.route('projects', function() {
        this.route('view', {path: '/:id'});
      });
    });
    # view.js to collect parameters 
    import Route from '@ember/routing/route';
    export default Route.extend({
        model(params) {
            // Ember.Logger.log("params: " + params.id);
            return params.id;
        }
    });
    # view.hbs to display
    <h3>current project: </h3>

    stop rendering the parent's outlet

    Each template will be rendered into the {{outlet}} of its parent route’s template. https://guides.emberjs.com/release/routing/rendering-a-template/

    We don’t really need to display the projects list in projects/view, so: https://stackoverflow.com/questions/32160056/ember-how-not-to-render-parents-template

    ember generate route projects/index
    // migrate code from projects.js/projects.hbs to projects/index.js(.hbs)

    add bootstrap

    ember install ember-bootstrap
    // then restart ember server

    add static file

    mkdir public/assets
    vi public/assets/data/projects.json
    // use in js:
    // return $.getJSON("https://api.github.com/orgs/python/repos");
    return $.getJSON("/assets/data/projects.json");
  • Installing and configuring a Java/Maven environment on Mac


    installing jdk

    download and install jdk from: http://www.oracle.com/technetwork/java/javase/downloads/index.html

    Multiple JDKs can be installed, e.g., Java 1.8 and Java 10.

    check installed jdk

    Run man java_home to learn how:

    The java_home command returns a path suitable for setting the JAVA_HOME environment variable. It determines this path from the user’s enabled and preferred JVMs in the Java Preferences application. Additional constraints may be provided to filter the list of JVMs available. By default, if no constraints match the available list of JVMs, the default order is used. The path is printed to standard output.

    bash-3.2$ /usr/libexec/java_home -V
    Matching Java Virtual Machines (2):
        10.0.2, x86_64:	"Java SE 10.0.2"	/Library/Java/JavaVirtualMachines/jdk-10.0.2.jdk/Contents/Home
        1.8.0_45, x86_64:	"Java SE 8"	/Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home
    bash-3.2$ /usr/libexec/java_home -v 1.8 --exec java -version
    java version "1.8.0_45"
    Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
    Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)
    bash-3.2$ /usr/libexec/java_home -v 10 --exec jshell
    |  Welcome to JShell -- Version 10.0.2
    |  For an introduction type: /help intro
    jshell> /exit
    |  Goodbye
    bash-3.2$ export JAVA_HOME=`/usr/libexec/java_home -v 1.8`
    bash-3.2$ java -version
    java version "1.8.0_45"
    Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
    Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

    config java_home

    According to man java_home, set JAVA_HOME in ~/.bash_profile:

    export JAVA_HOME=`/usr/libexec/java_home`

    check which java is actually in use

    ls -al /usr/bin/java

    removing java

    cd /Library/Java/JavaVirtualMachines
    sudo rm -rf jdk-10.0.2.jdk
    # verify
    /usr/libexec/java_home -V


    installing / configuring maven

    Download and unzip Maven, then set M2_HOME in ~/.bash_profile:

    export M2_HOME=$tools/current-maven
    export PATH=$M2_HOME/bin:$PATH

    check maven setup

    mvn -v will print the Maven home and Java home. If the Java home is incorrect, check the environment setup in $M2_HOME/bin/mvn.

  • What I Learned from a Performance Test

    I recently joined a new team as a do-everything engineer. The team is working hard to push a new web app to production. The app enables existing users to buy products provided by an external vendor, and it relies on existing authentication services, payment services, order services, etc.

    After spending a couple of weeks with an existing ‘performance testing’ team, I finally got the test ‘approved’. I learned a couple of things from this process.

    if a downstream service is not available for testing, mock it.

    Otherwise the performance test won’t happen at all. Given the following diagram, it’s almost impossible to get all dependencies ready for my testing, e.g.:

    • UserService has a testing environment, but it took days to request a test user.
    • PaymentGateway requires UserInfo to be ready, and all test data has to be created manually; an account could run out of money in the middle of a performance test.

    After agreeing with core stakeholders, I performed the test with the app itself + mocked (existing) internal services + real external services.

    +-----+         +-----------------------+              +-------------+ +-------------------------+ +-----------------------+                   +-----------------+ +-----------------------+
    | App |         | AuthenticationService |              | UserService | | ExternalProductService  | | InternalOrderService  |                   | PaymentGateway  | | ExternalOrderService  |
    +-----+         +-----------------------+              +-------------+ +-------------------------+ +-----------------------+                   +-----------------+ +-----------------------+
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       | user login             |                                 |                     |                          |                                        |                      |
       |----------------------->|                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        | user info verify request        |                     |                          |                                        |                      |
       |                        |-------------------------------->|                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                        verified |                     |                          |                                        |                      |
       |                        |<--------------------------------|                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                 tokens |                                 |                     |                          |                                        |                      |
       |<-----------------------|                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       | get user info          |                                 |                     |                          |                                        |                      |
       |--------------------------------------------------------->|                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                       user info |                     |                          |                                        |                      |
       |<---------------------------------------------------------|                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       | fetch products         |                                 |                     |                          |                                        |                      |
       |------------------------------------------------------------------------------->|                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |            products |                          |                                        |                      |
       |<-------------------------------------------------------------------------------|                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
       | user place order       |                                 |                     |                          |                                        |                      |
       |---------------------------------------------------------------------------------------------------------->|                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |
   |                        |                                 |                     |                          | verify token / payment request         |                      |
       |                        |                                 |                     |                          |--------------------------------------->|                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                      payment processed |                      |
       |                        |                                 |                     |                          |<---------------------------------------|                      |
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          | place order                            |                      |
       |                        |                                 |                     |                          |-------------------------------------------------------------->|
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |                          |                                        |   order confirmation |
       |                        |                                 |                     |                          |<--------------------------------------------------------------|
       |                        |                                 |                     |                          |                                        |                      |
       |                        |                                 |                     |       order confirmation |                                        |                      |
       |<----------------------------------------------------------------------------------------------------------|                                        |                      |
       |                        |                                 |                     |                          |                                        |                      |

    Performance testing should be conducted from Day 1

    During the performance testing, one slowness issue was detected: it took quite a long time to get user info from the mocked UserService. The mocked UserService does a very straightforward job: returning fixed user info from a local server. So the issue is not caused by UserService or the network; it must be caused by the app itself. I reviewed the source code: an unnecessary synchronization had been applied to the servlet. It was added in the first commit half a year ago, so the feedback loop for this piece of code was six months.

    If we ran PT as part of the DevOps pipeline, this issue could have been identified and fixed in the first PT.

    define / review the performance requirements with stakeholders as early as possible

    System design should consider performance requirements. Designing a web app for 20 internal users can be very different from designing a web app for 2000 clients. Performance concerns, such as external services and proxy servers, should be highlighted early. If developers are aware that the app needs to support 2000 concurrent users, they probably will not simply add a synchronized block to a servlet.

    Performance testing should be automated

    It took me a couple of days to help the performance testing team understand the app, then they spent a couple of days preparing their test cases. And if there’s any change to the app, I need to go through the whole process with them again. This does not make sense to me: as an agile team, we’re moving fast, but if it takes a couple of days (or even weeks) to test, the result is already invalid by the time the test finishes, because more changes have been applied to the app. Like other tests, performance tests should be automated: executed by a machine, with the test itself reporting pass or fail automatically.
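
    A hypothetical sketch of such an automatic pass/fail gate: the pipeline runs the load test, collects response times, and fails the build when the 95th percentile exceeds a budget (the numbers and names below are invented for illustration).

```javascript
// Hypothetical performance gate; metric choice and budget are made up.
function p95(latenciesMs) {
  const sorted = [...latenciesMs].sort((a, b) => a - b);
  return sorted[Math.ceil(sorted.length * 0.95) - 1];
}

function perfGatePassed(latenciesMs, budgetMs) {
  return p95(latenciesMs) <= budgetMs;
}

const sample = [120, 130, 110, 150, 500, 125, 140, 135, 115, 145];
console.log(p95(sample));                 // → 500 (one slow outlier)
console.log(perfGatePassed(sample, 200)); // → false: the build should fail
```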
