[trustable-software] Trustable Software Workflow

Andrew Banks andrew at andrewbanks.com
Sun Jan 8 10:49:35 UTC 2017

Without delving into the details of this proposal, I am HUGELY concerned that it seems to imply that Software Engineering is just about writing code...

It misses the obvious steps of:
* Requirements capture
* Design (at differing levels)

...and it treats testing as an afterthought.


-----Original Message-----
From: trustable-software [mailto:trustable-software-bounces at lists.trustable.io] On Behalf Of trustable at panic.fluff.org
Sent: 07 January 2017 11:35
To: Trustable software engineering discussion
Subject: Re: [trustable-software] Trustable Software Workflow

I feel this trustable model is missing a number of important concepts.
Let me begin with the actors associated with this workflow.

* There are no clear definitions of the roles of the actors

   The act of 'developing' a piece of code is explicitly separated
   from the act of 'requirement specification'.

   The act of 'project initiation', which sets out the 'desires' of the
   business for the development process, is missing.

   The act of 'audit', both of compliance with standards and of the
   produced reports, is missing.

For example, the PID ('Project Initiation Documentation') may say 'build me a car', while the process of defining the requirements may say 'provide vehicle motion', which during development raises a requirement to 'enable braking'. Emergent requirements are often discovered by the developer, but need to be implemented by some other role to ensure clear segregation of duties.

The ability to record this information, and the metadata associated with it, is missing from this workflow.
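To make the gap concrete, here is a minimal sketch of what recording that metadata could look like. All names and fields here are my own invention, not part of the proposal: a requirement carries its origin (the PID or a parent requirement), who raised it, and who implements it, with the segregation-of-duties rule enforced at assignment time.

```python
# Hypothetical schema for requirement metadata (field names are
# illustrative, not from the proposed workflow).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Requirement:
    req_id: str
    text: str
    raised_by: str                        # role that discovered the requirement
    implemented_by: Optional[str] = None  # must differ from raised_by
    parent: Optional[str] = None          # the PID or a parent requirement

def assign_implementer(req: Requirement, role: str) -> Requirement:
    # Segregation of duties: the discoverer may not implement.
    if role == req.raised_by:
        raise ValueError("segregation of duties: raiser cannot implement")
    req.implemented_by = role
    return req

# The PID-level desire, the derived requirement, and the emergent one:
pid = Requirement("PID-1", "build me a car", raised_by="sponsor")
motion = Requirement("R-1", "provide vehicle motion", raised_by="analyst", parent="PID-1")
braking = Requirement("R-2", "enable braking", raised_by="developer", parent="R-1")
assign_implementer(braking, "other-developer")
```

Even a record this small is enough to trace 'enable braking' back to the car, and to prove the developer who raised it did not implement it.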

* Standard encoding

   Standards today are rarely explicit about their requirements; they more often state that you must provide evidence of compliance, which is then discussed with the standards auditor.

   The most MAJOR issue today with standards compliance is that there isn't a standard markup for the reports produced and presented in the delivery of a compliant system. Today it is almost impossible to compare two systems for compliance; the evidence turns out to be a Word document or a PowerPoint deck. There is no XBRL tagging of the data generated for compliance.

   In this existing workflow there is confusion between the 'developer' marking up the standards and the act of writing the code.

   In my experience, though the developer may be aware of the standards, generating the evidence needed to show compliance is something the developer has no experience of. In the PCI-DSS case this role belongs to the Qualified Security Assessor (QSA). I'm sure there are equivalents in the various secure development standards, and this role is missing.

* Role Separation

   In my experience of development for Basel III and PCI-DSS, it is frequently possible to partake in many roles during the development cycle. For example, a developer on one requirement may review any other patch set except one which they developed themselves, or for which they specified the requirements.

   There are some roles which can only be performed by people who risk going to prison for failures of compliance in the delivery, or who require particular certifications to act on a transaction.

   There is nothing in the model which makes this clear or which looks at the issues of recording it.
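The role-separation rules above are mechanical enough to encode. The sketch below is my own illustration (the field names and the certified role are invented, not from any standard): a person may review any patch set except one they developed or whose requirements they specified, and certain sign-offs require a particular certification.

```python
# Illustrative role-separation checks; "assessor" stands in for a
# certified role such as a PCI-DSS QSA.
def may_review(person: str, patch: dict) -> bool:
    # Reviewer must be neither the developer nor the requirement specifier.
    return person not in (patch["developer"], patch["specifier"])

def may_sign_off(person: str, certifications: dict) -> bool:
    # Only a certified assessor may sign off the delivery.
    return "assessor" in certifications.get(person, set())

patch = {"id": "P-7", "developer": "alice", "specifier": "bob"}
certs = {"carol": {"assessor"}}
```

Recording who held which role on which artefact is exactly the metadata the model currently has nowhere to put.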

* Compliance Verification

   Opencontrol.XYZ suggests that we can encode the standards in a way that lets us see how they change. But how do we report on compliance, and confirm we have the data and metadata necessary to report on it?

   Is this a real-time dashboard report? Is this a post-commit document built as part of continuous integration? Are these reports the successful execution of particular tests?

   For me, the right approach follows OWASP: execute a particular series of tests against the code base and provide a dashboard report against the standard.

   However, the OWASP tool implementation, like many static code analysis tools, has issues with its trustworthiness and the timeliness of reporting compliance with standards, and you may have to build others on top of it.
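The dashboard idea can be sketched in a few lines. This is not the OWASP tooling itself: the clause IDs and the checks are invented for illustration, and a real pipeline would run proper test suites rather than string searches. The point is the shape: standard clauses mapped to executable checks, producing a machine-comparable report rather than a Word document.

```python
# Hypothetical clause-to-check mapping; clause IDs and checks are
# illustrative only, not real PCI-DSS test procedures.
checks = {
    "CLAUSE-6.5.1": lambda src: "eval(" not in src,        # no dynamic eval
    "CLAUSE-8.2.1": lambda src: 'password = "' not in src,  # no hard-coded creds
}

def dashboard(source: str) -> dict:
    # One PASS/FAIL entry per clause: comparable across systems,
    # unlike free-form compliance documents.
    return {clause: ("PASS" if check(source) else "FAIL")
            for clause, check in checks.items()}

report = dashboard('user = "bob"\nprint(user)')
for clause, status in sorted(report.items()):
    print(f"{clause}: {status}")
```

Because the report is structured data, two systems' compliance reports can be diffed mechanically, which is exactly what the Word-document status quo prevents.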

* Continuous Compliance

   It is my firm opinion that corruption or loss of the metadata associated with the delivery of a trustable solution should be spotted as early as possible.

   In alignment with the principles of continuous integration, the failure to provide this data is like failing to pass a test, and further action is required. Catching these early failures of data should stop further development until the data is in place.

   Neither this, nor where such constraints should be placed, is discussed at all.
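A minimal sketch of such a gate, assuming an invented set of required metadata fields (the field names are mine, not from the proposal): missing or absent compliance metadata is treated exactly like a failing test, halting the pipeline.

```python
# CI-style compliance gate; REQUIRED_FIELDS is an assumed policy,
# not part of the proposed workflow.
REQUIRED_FIELDS = {"requirement_id", "reviewer", "evidence_ref"}

def gate(commit_metadata: dict) -> list:
    """Return the list of problems; an empty list means the gate passes."""
    missing = REQUIRED_FIELDS - commit_metadata.keys()
    return sorted(f"missing: {field}" for field in missing)

problems = gate({"requirement_id": "R-2", "reviewer": "alice"})
if problems:
    # In CI this would be a non-zero exit, stopping further development
    # until the data is in place.
    print("COMPLIANCE GATE FAILED:", ", ".join(problems))
```

Running this as a pre-merge check catches the loss of metadata at the commit that caused it, rather than at audit time.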

* Display style

   I found the diagram delivered with this approach less helpful than it might have been. I would particularly like these discussions delivered in the form of sequence or swimlane diagrams, as I feel the ordering of the tasks and the conditional boundaries between them are the most important features of any such delivery.

Edmund J. Sutcliffe                     Thoughtful Solutions; Creatively
<edmunds at panic.fluff.org>               Implemented and Communicated
<http://panic.fluff.org>                +44 (0) 7976 938841

trustable-software mailing list
trustable-software at lists.trustable.io

