As with all methods of regulation, a code must be kept up to date and its operation monitored if it is to be effective. Codes enjoy advantages over other forms of regulation, since their flexibility permits rapid updating to keep pace with technological change. Industry may also be more inclined to accept a negotiated code of practice, which it feels more adequately represents its views, than a legislative measure, notwithstanding the fact that there is bound to be significant input from industry in formulating technical legislation.
Codes of Practice, as a form of self-regulation, are not legally binding; nevertheless, a defence against a claim of negligence must show that all relevant standards and codes of practice have been adhered to. As well as an obvious moral duty, software engineers may have legal responsibilities under statutes such as the Health and Safety at Work Act to ensure that other people are not put at risk by their acts or omissions. Membership of a professional organisation increases the likelihood of competence, and those involved in safety-related systems must work to the standards of a competent practitioner and may be able to show that they have done so.
We strongly agree that when systems are deemed to be safety-related, a member of an appropriate professional organisation should supervise the project, as declared in the BCS Code of Conduct. By keeping the regulations in general terms, the regulations themselves should rarely need amendment; the codes of practice can then be updated fairly quickly to keep pace with technology. Our suggested amendment to the directive is that a reasonable percentage of those who develop safety-related software should be members of a professional organisation. This will allow young software engineers to participate in safety-related development, gain the relevant experience and hence aid the development of the industry. It will be necessary for the courts to decide what constitutes a reasonable percentage.
Computer programs frustrate the law's traditional categories; they exhibit characteristics of both concrete property and abstract knowledge. A contract for bespoke software is one for work and materials, and as such falls under the Supply of Goods and Services Act 1982. This implies only that software must be written with reasonable skill and care – an extremely vague notion. There is no guarantee that software is bug-free, only that the number of bugs is no more than a reasonable person would regard as satisfactory. The formality of the necessary procedures tends to inhibit the law's attempts to keep pace with technological advancement.
Demonstrable Competence
For reasons mentioned earlier in the report, this is clearly a necessary characteristic of any software engineer who creates safety-related systems, and hence this part of the directive is supported. The challenge lies in how we determine demonstrable competence. The relevant characteristics are:
- Quality of education, including the quality of the educational institution, the grades obtained and the particular subjects studied.
- Skills mastered, including specialised knowledge of particular software packages.
- Employment history, including the kinds of jobs performed, how well they were performed, how easily the individual works with others, the quality of the software firms for which the individual has worked.
- Personal characteristics, including personality traits, leadership ability, and communication skills.
Further thought could be given to how these factors might be determined, but this is likely to be through examination of the individual's curriculum vitae and letters of reference. Such a process is already being implemented by invisible-hand-style forces in the UK safety-related industry today, but only on the tacit assumption that this is the case. Issuing a directive will aid the enforcement of this significant factor.
Chartered Status
Chartered Engineers tend to be people with the background knowledge, experience and professionalism the computing community would expect of a software engineer. However, it is not just software engineers who are involved in developing a safety-related system; a wide range of people are concerned. The suggestion that all involved should be Chartered Engineers is unreasonable, as it would mean that in some cases achieving the status becomes a higher priority than developing safe software.
The first reason for this is that the time required to update the examinations cannot match the rapid rate of technological change in safety-related software engineering: the examination would be out of date most of the time unless the update cycle could be shortened. Furthermore, Chartered Engineer status is valid for life. This raises the question of whether a software engineer who achieved Chartered status in 1970, and has not updated his or her knowledge and skills since, is still qualified to create safety-related software.
Making it compulsory for all those involved in the development of software for safety-related applications to be Chartered Engineers is not the ultimate solution. If this part of the directive were enforced, there would be a detrimental effect on the development of the safety-related software industry: less safety-related software would be produced, and it is arguable whether it would be of a higher standard.
Use of Standards
Standards tend to have the effect of freezing the technology at a particular point in time. This has the unfortunate implication that innovative design methods, which may arise from improvements in technology, may not comply with the standard. Relatively little scientific evaluation of software engineering techniques has been done, and it will be difficult to reach consensus on what should be included and what excluded. In the current rapidly evolving software environment we are not yet ready to seize upon one specific standard and say that this is the way forward.

Although design standards are easier to implement and enforce, there is much to be said for process standards such as ISO 9000, a generic management system standard that concerns processes rather than products. The way in which a safety-related software house manages its processes will obviously affect its final product, and in particular whether everything has been done to ensure that the product meets the customer's requirements. Standards, however, need to evolve with the industry, and this is part of the reason why we agree that attendance at regular professional updates on techniques and standards is a worthwhile directive. Working closely with UKAS, the United Kingdom Accreditation Service, is likely to be an important factor. If standards are not adhered to, it will be the responsibility of the software engineer to stand up in court, explain the departure from the standard, and demonstrate that what was done achieved an equivalent or higher standard. Standards are expected both to enhance safety and to have economic advantages.
Economic Implications
The UK is set to overtake Germany as the largest IT market in Europe, but slow growth in spending is increasing US domination of the industry.
"About 40% of the European economy is affected by IT. If we spend less, the productivity gap between Europe and the US will also increase. It is really bad for Europe not to do something."
Charles Homs, Senior Analyst at Forrester Research
If we impose unreasonable bureaucracy on safety-related issues, then we will inevitably suffer as an economy, because software developers will not produce the same amount of software. Since there is at present no uniform, accurate or practical approach to predicting and measuring software reliability, there is no reason why we should force safety-related developers to adhere to such standards. It is clear that people do not go into business to be safe, and it is a sad state of affairs that software companies, especially in times of recession, tend to think of safety as an afterthought. Society needs educating in the fact that safety can actually be an economically viable activity, not a resource-swallowing chore.
There is nothing like an accident for motivating people to adopt safe working practices. The problem lies in changing attitudes before catastrophe occurs; the period following an accident is a golden opportunity for instigating changes, when even the most indifferent person is likely to be more receptive. History tells us that people do not respond well to overburdening bureaucracy and are more susceptible to a subtler approach. To implement this we need to build a safety culture that permeates every stage of the software development process. There is no reason to treat safety differently from other objectives, such as productivity and profit.
People are more interested in returns on investment because of a lethargic attitude towards safety. This is the same indifference identified by Robens in his report, whose poignant words still ring true today:
Apathy is the greatest single contributing factor to accidents at work. This attitude will not be cured so long as people are encouraged to think that health and safety at work can be ensured by an ever-expanding body of legal regulations enforced by an ever-increasing army of inspectors.
Conclusion
Notwithstanding our reservations about these directives, not responding to change would have a negative effect on industry, as demonstrated by the Sigmoid curves discussed in the Appendix. The safety-related software industry is still developing, and the current directives are enough to guide it until it matures. Safety issues should be central at all stages of the development process, not an afterthought, and it seems likely that in the future only accredited people will be able to work in the safety-related area. The challenge, however, lies deeper than the ostensible regulation issue: what is needed is a cultural change in attitudes to safety.
The apathy identified by Robens in 1970 is still present in contemporary society and is not going to be solved by the introduction of an EC directive, as human attitudes are notoriously hard to change. A comprehensive approach, with many mechanisms to improve software coupled with a cultural attitude that values quality and instils individual responsibility, is needed to ensure acceptable safety of software-intensive engineered systems. There are no simple and universal fixes that will solve the problem of ensuring public safety. A vigilant safety culture is what we must strive for.
Appendix – The Sigmoid curve
The curve describes the normal lifecycle: a period of learning or investment, in which inputs exceed outputs, followed by steady growth that inevitably one day peaks and turns into decline. The only way to prolong the life of the body in question is to start a second curve; but to allow time and resources for the initial period of learning and investment, that second curve has to start before the first one peaks.
Applying this to the safety-related software industry shows that although investment in safety can seem restrictive, long-term thinking suggests it is worthwhile. We can see that the only way to prolong the life of the industry is to start a second curve and begin investing in safety.
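As an illustrative sketch (not part of Handy's original formulation), the rising portion of the S-shaped curve described above is often modelled by the logistic function, where the symbols below are our own labels:

```latex
% Logistic (sigmoid) growth of output y over time t:
%   L   = the eventual ceiling (industry maturity level)
%   k   = the growth rate
%   t_0 = the time of fastest growth (the inflection point)
y(t) = \frac{L}{1 + e^{-k(t - t_0)}}
```

Early on, when $t \ll t_0$, inputs exceed outputs; growth is fastest at $t_0$; and thereafter returns flatten towards $L$. This flattening is precisely why the second curve must be started before the first one peaks.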
Bibliography
Internet Resources:
Bennet, Paul E.
Safety By Design
Available: http://www.amleth.demon.co.uk/library/papers/forth/safedesn/safedesn.htm
19th February 2003.
Ericson II, Clifton A.
Software Safety in a Nutshell.
Available: http://www.dcs.gla.ac.uk/~johnson/teaching/safety/reports/Clif_Ericson1.htm
17th February 2003.
Falla, Mike.
Advances in Safety Critical Systems.
Available: http://www.comp.lancs.ac.uk/computing/resources/scs/
12th February 2003.
Knight et al.
On Licensing of Software Engineers Working on Safety-Critical Software.
Final Report of an ACM task force.
Available: http://www.acm.org/serving/se_policy/safety_critical.pdf
6th January 2003.
Kowlenko, Kathy.
New U.S. Licensure Exam Focuses on Practice Areas.
Available: http://www.spectrum.ieee.org/INST/may02/fnew.html
19th February 2003.
Health and Safety Commission.
Health and Safety Statistics Highlights 2001/2002.
Available: http://www.hse.gov.uk/statistics/overall/hssh0102.pdf
19th February 2003.
International Organisation for Standardisation.
Generic Management System Standards.
Available: http://www.iso.ch/iso/en/iso9000-14000/tour/generic.html
17th February 2003.
Aziz, Mustapha M.
Software Safety: Liability and Practice.
Available: http://www.ex.ac.uk/~mmaziz/com1409/lect9.pdf
19th February 2003.
Orzech, Dan.
When It's Life or Death, Some 911 Systems Turn To Linux.
Available: http://boston.internet.com/news/article.php/1497311
12th February 2003.
Rowland, Diane.
Cyberspace: a world apart?
Available: http://www.bileta.ac.uk/98papers/rowland.html
12th February 2003.
Printed Resources:
Ayres, Robert. The Essence of Professional Issues in Computing. Prentice Hall, 1998.
Bott, Frank et al. Professional Issues in Software Engineering. Taylor & Francis, 2001.
Handy, Charles. The Elephant and the Flea. Arrow, 2002.
Scheinholtz, Michael. Software Safety. Carnegie Mellon University, 1998.