Global Best Practices

Introduction

Following these global best practices enables App Creators to increase App Users' trust in their Apps by demonstrably reducing the potential for risks and harms to App Users arising from the use of those Apps, whether by the App Users themselves or by others. The practices cover App technology, operation and governance.

These global best practices do this:

  • by creating opportunities for App Users, third parties and the general public to find, raise and have resolved potential risks, harms or other issues in technology, operation and governance that App Creators did not find or resolve themselves, for whatever reasons;

  • by reducing opportunities for App Creators, or individual insiders within App Creator organizations, to inadvertently or intentionally deceive App Users.

As a result, App Users can have higher confidence in the trustworthiness of Apps whose App Creators follow these best practices than in Apps whose App Creators do not.

This is a draft list, and expected to evolve over time.

Potential risks or harms addressed

Specifically, we consider the following scenarios:

  1. App Creators commit inadvertent mistakes that can expose App Users to avoidable risks or harms (“MISTAKES”)

    • Example: because of a typo in the source code, an App does not perform an access control check on an API, as a result of which third parties can download certain App data without being authorized.

    • Example: an App Developer implements an algorithm that has a significant, publicly known security vulnerability of which the App Developer is unaware.

    • Example: communication between the App Creators' engineering and marketing groups is insufficient, and some information about the App published on the App’s official website is simply not true, or is no longer true because the App’s development has continued.

  2. App Creators use sub-standard architectures, data structures, algorithms, and the like, or apply sub-standard workmanship when creating, operating or governing the App (“SUBSTANDARD”)

    • Example: a Contact Tracing App requires App Users to provide their full name, address and social security number, although alternate approaches to implementing Contact Tracing based on Pseudonymity are well-known and could have been used instead, creating unnecessary privacy risks.

    • Example: coding of an App is sloppy or uses “spaghetti code”, which makes it hard for both reviewers and the App Developers themselves to be certain what the code actually does under all circumstances, thereby creating additional risks.

  3. App Creators deliberately deceive App Users (“DECEPTION”)

    • Example: an App contains undeclared spyware.

  4. Information published to App Users by App Creators is misleading (“MISLEADING”)

    • Example: the App portrays itself as (only) performing one type of functionality (e.g. contact tracing), but it also silently contains significant other functionality (e.g. it collects App Users' location history).

  5. While the App Creator organization(s) themselves do not intend to deceive App Users, some individuals in an App Creator organization perform an insider attack (“INSIDER”)

    • Example: a software developer adds “backdoor code” to the shipped App in order to spy on his ex-girlfriend, without the rest of the App Creator team knowing.

    • Example: a systems administrator, for example because they have been bribed or blackmailed, creates additional backup tapes of confidential information and makes them available to third parties, unbeknownst to the rest of the App Creator team or to App Users.

Best practices: technology

  1. The App implements “best-in-class” architectures, data structures, algorithms and communication protocols.

    For example, some architectures are inherently more privacy-preserving (on-device data vs on-cloud data), as are some data structures (e.g. pseudonymous vs fully identified users).

    See the section on implementation choices and their pros and cons; one such choice, a pseudonymous data structure, is sketched below.

    Addresses:

    • SUBSTANDARD, e.g. unnecessary risks due to sloppy, insecure, or other substandard workmanship;

    • MISLEADING, e.g. the intentional use of weaker security than available.
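
    To illustrate the data-structure point, here is a minimal sketch (all names are hypothetical, not taken from any particular App) of representing an App User by a rotating random pseudonym instead of directly identifying attributes such as full name, address or social security number.

    ```python
    # Minimal sketch (hypothetical names): a user record that carries no
    # directly identifying fields, only a rotating random pseudonym.
    import secrets
    from dataclasses import dataclass, field


    @dataclass
    class PseudonymousUser:
        # 128-bit random pseudonym; a fresh one can be issued per epoch so
        # that observations cannot easily be linked over time.
        pseudonym: str = field(default_factory=lambda: secrets.token_hex(16))

        def rotate(self) -> str:
            """Issue a fresh pseudonym, unlinkable to the previous one."""
            self.pseudonym = secrets.token_hex(16)
            return self.pseudonym


    if __name__ == "__main__":
        user = PseudonymousUser()
        print("current pseudonym:", user.pseudonym)
        print("after rotation:   ", user.rotate())
    ```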

  2. The full source code is available for review by third parties, for example by being published to the general public on a website such as GitHub. The published code must include all relevant components (e.g. Smartphone Component as well as Cloud Component, needed libraries in the correct versions, etc.).

    For our purposes here, it is not necessary that the source code be licensed using a FOSS license; only that it is available for review, building, instrumenting and running for the purposes of evaluating its trustworthiness.

    Addresses:

    • MISTAKES, e.g. bugs in the code;

    • SUBSTANDARD, e.g. because developers fear for their reputation if they are associated with sloppy work;

    • DECEPTION, e.g. reviewers can verify that the code does not contain hidden backdoors;

    • MISLEADING, e.g. reviewers can confirm that the published code works as advertised.

  3. All dependencies follow the same best practices as the main App. Those dependencies include static code dependencies (e.g. libraries), dynamic code dependencies (e.g. code loaded at run-time), online services (e.g. hosted analytics services), hosting services (e.g. IaaS platforms), etc.

    This is necessary because otherwise, even if the App itself is trustworthy and low-risk, App Users remain at risk from components included from other developers. One way of making such dependencies verifiable is sketched below.
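
    The sketch assumes a hypothetical manifest file, dependency-hashes.json, mapping each vendored dependency to its expected SHA-256 digest; a reviewer or a CI job can then confirm that the shipped third-party code is exactly what was pinned and reviewed.

    ```python
    # Minimal sketch (hypothetical file names): verify that vendored
    # dependencies match the SHA-256 digests pinned in a manifest.
    import hashlib
    import json
    from pathlib import Path


    def sha256_of(path: Path) -> str:
        """Hex SHA-256 digest of a file."""
        return hashlib.sha256(path.read_bytes()).hexdigest()


    def verify_dependencies(manifest_file: str = "dependency-hashes.json") -> bool:
        """Compare each pinned dependency file against its recorded digest."""
        manifest = json.loads(Path(manifest_file).read_text())
        ok = True
        for dep_path, expected_digest in manifest.items():
            if sha256_of(Path(dep_path)) != expected_digest:
                print(f"MISMATCH: {dep_path}")
                ok = False
        return ok


    if __name__ == "__main__":
        print("all dependencies match:", verify_dependencies())
    ```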

  4. Complete build instructions are available and can be easily followed by reviewers of the source code.

    Addresses:

    • MISTAKES, e.g. a submodule’s source code was accidentally not published and thus could not be reviewed;

    • SUBSTANDARD, e.g. an unreliable build process leading to inconclusive testing results.

  5. The App can be built with a Reproducible Build.

    This enables reviewers to verify that the published, downloadable version of the App is identical to the version whose source code has been published for review, as sketched below.

    Addresses:

    • DECEPTION, e.g. a submodule’s source code was intentionally not published because it contained spyware that the App Creators intended to keep hidden;

    • INSIDER, e.g. because it would show that a rogue engineer modified/added/removed code during the release process.

    Note: unfortunately, the technical foundations of Reproducible Builds are still in their infancy, and even well-meaning App Creators may not be able to employ the technique (yet) in some circumstances, e.g. on iOS.
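
    Where a Reproducible Build is achievable, verification is conceptually simple: rebuild the App from the published source and compare the result, bit for bit, with the artifact distributed to App Users. A minimal sketch, with hypothetical file names:

    ```python
    # Minimal sketch (hypothetical file names): check that an independently
    # rebuilt artifact is bit-for-bit identical to the artifact shipped to
    # App Users, which is the property a Reproducible Build provides.
    import hashlib
    from pathlib import Path


    def digest(path: str) -> str:
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()


    if __name__ == "__main__":
        shipped = digest("app-release-1.2.3.apk")   # artifact distributed to users
        rebuilt = digest("app-rebuilt-1.2.3.apk")   # reviewer's own build from source
        if shipped == rebuilt:
            print("Builds match: the shipped binary corresponds to the reviewed source.")
        else:
            print("Builds differ: the shipped binary cannot be traced to the reviewed source.")
    ```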

  6. The App’s architecture, key data structures, key algorithms and communication protocols are publicly documented.

    This aids reviewers in understanding the published code and in performing black-box testing, and makes it easier to find issues.

  7. Unit and system tests are documented, available, and easily executable by reviewers.

    This enables reviewers to understand the reviewed code more easily. It also enables them to add their own tests to verify that additional scenarios (e.g. edge cases) behave as expected, as in the sketch below.
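
    A minimal sketch of such a reviewer-added edge-case test (function names are hypothetical; the function under test is inlined so the example is self-contained, whereas in a real review it would be imported from the App's published source):

    ```python
    # Minimal sketch (hypothetical names): the kind of edge-case test a
    # reviewer might add when an App's tests are published and easy to run.
    import secrets
    import unittest


    def rotate_pseudonym(_current: str) -> str:
        """Stand-in for an App function that issues a fresh, unlinkable pseudonym."""
        return secrets.token_hex(16)


    class TestPseudonymRotation(unittest.TestCase):
        def test_rotation_yields_a_different_value(self):
            before = secrets.token_hex(16)
            self.assertNotEqual(before, rotate_pseudonym(before))

        def test_rotation_preserves_length(self):
            # Edge case: a rotated pseudonym must keep the expected 128-bit size.
            self.assertEqual(len(rotate_pseudonym(secrets.token_hex(16))), 32)


    if __name__ == "__main__":
        unittest.main()
    ```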

  8. A security/risk analysis is publicly available that documents known risks or potential vulnerabilities, their potential impact, and measures taken by the App Creators to mitigate them.

  9. If the App includes any form of advertising, the full privacy impact of advertising in the App is clearly explained.

Best practices: operations

  1. All published information is kept up to date, including documentation of technology, operations and governance.

    Out-of-date information delays the detection of all kinds of issues for longer than necessary.

  2. The App development is performed in the open. This includes:

    • a public issue tracker, which contains the known open issues, planned enhancements, discussion of such open issues and enhancements, as well as a history of closed issues and their disposition;

    • the ability for App Users and the general public to review and contribute to the public issue tracker;

    • public commit history of changes to the code base and tests;

    • ability for reviewers and the general public to access and test pre-release versions of the App.

  3. All issues raised in the issue tracker are promptly and sufficiently addressed in public by the App Creators, regardless of who raised them. This is best done within the issue tracker itself.

  4. All known attempted attacks are publicly documented, whether or not they were successful. This includes all types of Attacks, including security breaches, Re-identification attacks, Data Poisoning, etc. The impact of such attempted or successful attacks on App Users and others is clearly documented, as are the steps taken to recover.

  5. Independent review is encouraged. This includes review of any or all aspects of technology, process and governance.

    This may go all the way to App Creators funding audits by independent, qualified third parties, or establishing an oversight board with members of the stakeholder community, including verifying that the financial results relating to the App are consistent with its declared business model. Depending on the auditors’ independence and trustworthiness, more or less detailed information should be released by them to the public.

  6. The day-to-day operations and their status are publicly documented.

    This includes App development activities such as check-ins, and operational activities such as deletion of data that has reached the end of its retention period.

  7. Critical process steps require a second person. For example, software commits may require sign-off from a second developer who is jointly responsible with the committer for the change.
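
    As one illustration, the following minimal sketch checks that a commit message carries a "Reviewed-by:" trailer naming someone other than the author; the trailer convention and the names are assumptions, not a prescribed mechanism, and such a check could run in a CI pipeline or server-side hook.

    ```python
    # Minimal sketch (assumed conventions): enforce that a commit names an
    # independent reviewer via a "Reviewed-by:" trailer.
    def has_independent_review(author_email: str, commit_message: str) -> bool:
        """True if the message names a reviewer who is not the author."""
        for line in commit_message.splitlines():
            if line.lower().startswith("reviewed-by:"):
                reviewer = line.split(":", 1)[1].strip()
                if author_email.lower() not in reviewer.lower():
                    return True
        return False


    if __name__ == "__main__":
        message = (
            "Add missing access control check on data export API\n\n"
            "Reviewed-by: Second Developer <second.dev@example.org>\n"
        )
        print(has_independent_review("first.dev@example.org", message))  # True
    ```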

  8. All contributors who have a meaningful role in the development, operations or governance of the App are vetted.

  9. The approach to protecting critical data is documented and followed strictly. For security reasons, this need not be public.

    This includes items such as:

    • how backups are secured (e.g. encryption, storage)
    • how root keys are secured (e.g. background checks for authorized employees, physical security, use of threshold cryptography)
    • how monitoring in the field is performed

    and others, depending on the particulars of the App. One of these measures, encrypting backups, is sketched below.
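
    The following minimal sketch (file names hypothetical; requires the third-party "cryptography" package) encrypts a backup before it leaves controlled systems, so that an extra copy exfiltrated by an insider is useless without the separately held key.

    ```python
    # Minimal sketch (hypothetical file names): encrypt a backup before it is
    # written to removable media. Requires the third-party "cryptography" package.
    from pathlib import Path

    from cryptography.fernet import Fernet


    def encrypt_backup(plain_path: str, encrypted_path: str, key: bytes) -> None:
        """Encrypt a backup file with a symmetric key held separately from the media."""
        token = Fernet(key).encrypt(Path(plain_path).read_bytes())
        Path(encrypted_path).write_bytes(token)


    if __name__ == "__main__":
        # In practice the key would live in an HSM or be split with threshold
        # cryptography, never stored alongside the backup itself.
        key = Fernet.generate_key()
        encrypt_backup("backup.tar", "backup.tar.enc", key)
    ```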

  10. There is a whistleblower process by which an insider can anonymously, and effectively, raise issues with any aspect of technology, process or governance related to the App.

    Such a whistleblower process may be as simple as allowing insiders to log issues in the public bug tracker using an identity other than their work identity.

Best practices: governance

  1. The full set of organizations participating in any aspect of App creation, evolution, operations or governance is publicly documented, together with their respective App Creator roles.

  2. The App’s financing and business model is publicly documented. It is consistent with all other information available about the App, and clearly explains which of the App Creator organizations provide resources to, or obtain resources from, the App.

  3. Governance decisions are publicly documented, e.g. as meeting minutes or in the issue tracking system.

  4. Clear remedies exist that App Users can trigger should App Creators breach their trust. These remedies may exist under contract law or other forms of law.

  5. All independent reviews are available to the general public. This includes positive and negative reviews; of course, App Creators are free to publish any rebuttal.