Create a new service with the MemoryHigh and MemoryMax directives.

$ systemctl edit --force --full memory.service
[Unit]
Description=Simple service to test memory limit.

[Service]
ExecStart=/root/memory.sh
MemoryHigh=1M
MemoryMax=2M

[Install]
WantedBy=multi-user.target

The content of /root/memory.sh:

#!/bin/bash

echo $(date) > /tmp/test.log
a=()

for (( i=1; i<=10; i++ ))
do
    echo Loop $i >> /tmp/test.log
    for (( c=1; c<=600000; c++ ))
    do
        a+=( "abcdefghijklmnopqrstquvxyabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzzabcdefghijklmnopqrstquvxyabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzzabcdefghijklmnopqrstquvxyabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzabcdefghijklmnopqrstquvxyzz" )
    done
done

sleep 10
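
The allocation pattern can also be watched from a plain shell session. A minimal sketch (Linux-only, assumes a /proc filesystem) that samples the shell's resident set before and after growing an array the way memory.sh does:

```shell
#!/bin/bash
# Sample this shell's resident set size (in kB) from /proc before and
# after growing a bash array, mirroring what memory.sh does at scale.
rss() { awk '/^VmRSS/{print $2}' "/proc/$$/status"; }

before=$(rss)
a=()
for (( c=1; c<=50000; c++ )); do
    a+=( "abcdefghijklmnopqrstuvwxyz0123456789" )
done
after=$(rss)
echo "VmRSS grew from ${before} kB to ${after} kB"
```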

Start the service:

root@tuxedo:/etc/systemd/system# systemctl daemon-reload
root@tuxedo:/etc/systemd/system# systemctl enable --now memory
root@tuxedo:/etc/systemd/system# systemctl status memory
● memory.service - Simple service to test memory limit.
Loaded: loaded (/etc/systemd/system/memory.service; enabled; vendor preset: enabled)
Active: active (running) since Thu 2022-09-01 22:26:27 CEST; 9s ago
Main PID: 14675 (memory.sh)
Tasks: 1 (limit: 76224)
Memory: 1.9M (high: 1.0M max: 2.0M)
CGroup: /system.slice/memory.service
└─14675 /bin/bash /root/memory.sh

wrz 01 22:26:27 tuxedo systemd[1]: Started Simple service to test memory limit..

After a while:

root@tuxedo:/etc/systemd/system# systemctl status memory
● memory.service - Simple service to test memory limit.
Loaded: loaded (/etc/systemd/system/memory.service; enabled; vendor preset: enabled)
Active: failed (Result: signal) since Thu 2022-09-01 22:27:31 CEST; 8s ago
Process: 14675 ExecStart=/root/memory.sh (code=killed, signal=KILL)
Main PID: 14675 (code=killed, signal=KILL)

wrz 01 22:26:27 tuxedo systemd[1]: Started Simple service to test memory limit..
wrz 01 22:27:31 tuxedo systemd[1]: memory.service: Main process exited, code=killed, status=9/KILL
wrz 01 22:27:31 tuxedo systemd[1]: memory.service: Failed with result 'signal'.

And in the dmesg:

$ dmesg
[ 5679.682307] Tasks state (memory values in pages):
[ 5679.682308] [  pid  ]   uid  tgid total_vm      rss pgtables_bytes swapents oom_score_adj name
[ 5679.682310] [  14675]     0 14675   202158      862  1646592   199134             0 memory.sh
[ 5679.682316] oom-kill:constraint=CONSTRAINT_MEMCG,nodemask=(null),cpuset=/,mems_allowed=0,oom_memcg=/system.slice/memory.service,task_memcg=/system.slice/memory.service,task=memory.sh,pid=14675,uid=0
[ 5679.682330] Memory cgroup out of memory: Killed process 14675 (memory.sh) total-vm:808632kB, anon-rss:0kB, file-rss:3448kB, shmem-rss:0kB, UID:0 pgtables:1608kB oom_score_adj:0
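
The two memory figures in the dmesg output are consistent: the tasks table reports total_vm in 4 KiB pages, while the kill message reports kB. A quick check:

```shell
# total_vm from the tasks table is in 4 KiB pages;
# 202158 pages * 4 kB/page matches total-vm:808632kB in the kill message.
pages=202158
echo "$(( pages * 4 )) kB"
```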

The simplest possible service, which runs the /root/process.sh script:

root@tuxedo:/etc/systemd/system# cat process.service

[Unit]
Description=Simple service to test process limit.

[Service]
ExecStart=/root/process.sh

[Install]
WantedBy=multi-user.target

The content of /root/process.sh:

#!/bin/bash

echo $(date) >> /tmp/test.log
for (( c=1; c<=10; c++ ))
do
    sleep 10 &
done

sleep 20

Start the service:

root@tuxedo:/etc/systemd/system# systemctl daemon-reload
root@tuxedo:/etc/systemd/system# systemctl enable --now process
root@tuxedo:/etc/systemd/system# systemctl status process
● process.service - Simple service to test process limit.
Loaded: loaded (/etc/systemd/system/process.service; enabled; vendor preset: enabled)
Active: active (running) since Tue 2022-08-30 22:59:03 CEST; 2s ago
Main PID: 18596 (process.sh)
Tasks: 12 (limit: 76224)
Memory: 2.3M
CGroup: /system.slice/process.service
├─18596 /bin/bash /root/process.sh
├─18598 sleep 10
├─18599 sleep 10
├─18600 sleep 10
├─18601 sleep 10
├─18602 sleep 10
├─18603 sleep 10
├─18604 sleep 10
├─18605 sleep 10
├─18606 sleep 10
├─18607 sleep 10
└─18608 sleep 20

sie 30 22:59:03 tuxedo systemd[1]: Started Simple service to test process limit..

Now add TasksMax=5 to the [Service] section:

[Unit]
Description=Simple service to test process limit.

[Service]
ExecStart=/root/process.sh
TasksMax=5

[Install]
WantedBy=multi-user.target

and restart the service:

root@tuxedo:/etc/systemd/system# systemctl daemon-reload
root@tuxedo:/etc/systemd/system# systemctl restart process
root@tuxedo:/etc/systemd/system# systemctl status process
● process.service - Simple service to test process limit.
Loaded: loaded (/etc/systemd/system/process.service; enabled; vendor preset: enabled)
Active: active (running) since Tue 2022-08-30 23:04:20 CEST; 9s ago
Main PID: 18961 (process.sh)
Tasks: 5 (limit: 5)
Memory: 1.0M
CGroup: /system.slice/process.service
├─18961 /bin/bash /root/process.sh
├─18963 sleep 10
├─18964 sleep 10
├─18965 sleep 10
└─18966 sleep 10

sie 30 23:04:20 tuxedo systemd[1]: Started Simple service to test process limit..
sie 30 23:04:20 tuxedo process.sh[18961]: /root/process.sh: fork: retry: Resource temporarily unavailab>
sie 30 23:04:21 tuxedo process.sh[18961]: /root/process.sh: fork: retry: Resource temporarily unavailab>
sie 30 23:04:23 tuxedo process.sh[18961]: /root/process.sh: fork: retry: Resource temporarily unavailab>
sie 30 23:04:27 tuxedo process.sh[18961]: /root/process.sh: fork: retry: Resource temporarily unavailab>
sie 30 23:04:30 tuxedo process.sh[18961]: /root/process.sh: fork: Interrupted system call
sie 30 23:04:30 tuxedo systemd[1]: process.service: Main process exited, code=exited, status=254/n/a
sie 30 23:04:30 tuxedo systemd[1]: process.service: Failed with result 'exit-code'.
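
The same limit can also be applied without touching the unit file itself, via a drop-in; a sketch (this is the conventional override path that systemctl edit process.service creates):

```ini
# /etc/systemd/system/process.service.d/override.conf
[Service]
TasksMax=5
```

After creating the drop-in, run systemctl daemon-reload and restart the service as above.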


Keycloak

Roles defined in OAuth 2.0:

  • Resource owner
  • Resource server
  • Client
  • Authorization server

Which flow to use:

  • Client Credentials flow if the application is accessing the resource on behalf of itself (the application is the resource owner)
  • Device flow if the application is running on a device without a browser or is input-constrained (e.g. a smart TV)
  • Otherwise, use the Authorization Code flow.

Do not use Implicit flow or Resource Owner Password Credentials flow at all.

Strategies that can be used to limit access for a specific access token:

  • Audience
  • Roles
  • Scope

To verify an access token:

  • Retrieve the public key from the JWKS endpoint
  • Verify the signature
  • Verify that the token is not expired
  • Verify the issuer, audience and type of the token
  • Verify any other claims that your application cares about
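
The last three checks can be sketched in shell once the token's payload segment is base64url-decoded. The token below is a hypothetical example built in place, not a real Keycloak token, and this does not replace verifying the signature against the JWKS key:

```shell
#!/bin/bash
# Build a hypothetical JWT-shaped token, then decode its payload segment
# the way you would inspect a real access token's claims.
CLAIMS='{"iss":"https://keycloak.example.org","aud":"my-api","exp":1900000000}'
PAYLOAD=$(printf '%s' "$CLAIMS" | base64 -w0 | tr '/+' '_-' | tr -d '=')
TOKEN="eyJhbGciOiJSUzI1NiJ9.${PAYLOAD}.signature-not-checked-here"

# Take segment 2, restore base64 padding, undo the URL-safe alphabet, decode
SEG=$(printf '%s' "$TOKEN" | cut -d. -f2)
PAD=$(( (4 - ${#SEG} % 4) % 4 ))
[ "$PAD" -eq 0 ] || SEG="${SEG}$(printf '=%.0s' $(seq 1 "$PAD"))"
DECODED=$(printf '%s' "$SEG" | tr '_-' '/+' | base64 -d)
echo "$DECODED"

# Expiry check: compare the exp claim against the current epoch time
EXP=$(printf '%s' "$DECODED" | sed 's/.*"exp":\([0-9]*\).*/\1/')
[ "$(date +%s)" -lt "$EXP" ] && echo "token not expired"
```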

Internal applications (first-party) are those owned by the enterprise. They may be self-hosted or SaaS. No need for the user’s consent.

External applications (third-party) should require the user’s consent.

Possible architectures of an application we are securing:

  • Server side
  • SPA (Single Page Application) with dedicated REST API under the same domain.
  • SPA with an intermediary API under the same domain (which in turn may call external APIs)
  • SPA with external API

Securing native and mobile applications: how to return the authorization code to the application:

  • Claimed HTTPS scheme. The native application claims a scheme like https://my.app.org. Such URLs then open in the app instead of a browser.
  • Custom URI scheme. Such a request is sent (opened) by the application. Example: org.app.my://oauth2/provider-name .
  • Loopback interface. The application opens a temporary web server on a random port, e.g. https://127.0.0.1:2345
  • Special redirect URI, e.g. urn:ietf:wg:oauth:2.0:oob - the authorization code is displayed by Keycloak for manual copy & paste.

Example authorization strategies:

  • role-based access control (RBAC). Keycloak has realm-level and client-level roles.
  • group-based access control (GBAC)
  • OAuth2 scopes
  • attribute-based access control (ABAC)

Keycloak can act as a centralized authorization service through a functionality called Authorization Services.

For a production setup:

  • properties.frontendURL
  • properties.forceBackendUrlToFrontendURL
  • properties.adminURL (make it private URL)
  • TLS
  • production database (encryption in transit and at rest)
  • possibly clustering and load balancing
  • password policy
  • enable refresh token rotation
  • use ECDSA (Elliptic Curve Digital Signature Algorithm) for signatures instead of RSA

Keycloak uses Apache FreeMarker for templates.


The example below is based on Moodle, but similar steps apply to any OpenID Connect (OIDC) Relying Party. The Moodle URL used here as an example is https://test.pycert.org. Replace it with your installation URL.

Log in to github and register a new OAuth app.

  • Application name can be anything you want
  • Homepage URL is your Moodle URL, i.e. https://test.pycert.org
  • Authorization callback URL is your Moodle URL + /admin/oauth2callback.php, i.e. https://test.pycert.org/admin/oauth2callback.php

From the next page note:

  • Client ID
  • Client secret (Generate a new client secret)

Go to Moodle and log in as admin. Under Site administration -> Authentication -> Manage authentication (https://test.pycert.org/admin/settings.php?section=manageauths), enable OAuth2.

Go to Site administration -> Server -> OAuth 2 services and click on “Custom”.

  • Name - anything you want, e.g. github.
  • Client ID - copied from github, from your note above.
  • Client secret - like above.
  • Authenticate token requests via HTTP headers - set to true.
  • Service base URL - https://github.com
  • Logo URL - https://github.com/favicon.ico
  • This service will be used - Login page only
  • Name displayed on the login page - anything you want, e.g. github.
  • Scopes included in a login request - user:email
  • Scopes included in a login request for offline access - user:email
  • Additional parameters included in a login request - leave empty.
  • Additional parameters included in a login request for offline access - leave empty.
  • Login domains - leave empty.
  • Require email verification - leave checked (true).

After submitting, on the admin/tool/oauth2/issuers.php page click the “Configure endpoints” icon. Add the following endpoints (Name, URL):

  • authorization_endpoint, https://github.com/login/oauth/authorize
  • token_endpoint, https://github.com/login/oauth/access_token
  • userinfo_endpoint, https://api.github.com/user

Log out (or use another browser, e.g. an incognito window) and try to log in to your Moodle site using a github account.

If you receive an error:

The user information returned did not contain a username and email address. The OAuth 2 service may be configured incorrectly.

It means that your email addresses are kept private in your github settings (the “Keep my email addresses private” setting). The email is not passed from github to Moodle during the OAuth workflow, and since Moodle requires at least a login and an email address when creating the account, it fails. One way to fix it is to set “Keep my email addresses private” to false in github.

I ended up patching core Moodle and adding an extra call to https://api.github.com/user/emails to retrieve an email.
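
For reference, the /user/emails endpoint returns a JSON array per address; picking the primary, verified one (which is what Moodle needs) can be sketched like this, against a hypothetical sample of the response rather than a live API call:

```shell
#!/bin/bash
# Hypothetical sample of a GET https://api.github.com/user/emails response
RESP='[{"email":"octocat@example.com","primary":true,"verified":true},
{"email":"other@example.com","primary":false,"verified":true}]'

# Pick the primary, verified address (python3 used for robust JSON parsing)
printf '%s' "$RESP" | python3 -c '
import json, sys
emails = json.load(sys.stdin)
print(next(e["email"] for e in emails if e["primary"] and e["verified"]))
'
```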



Display certificate information:

$ openssl s_client -connect muras.eu:443
CONNECTED(00000003)
depth=2 C = US, O = Internet Security Research Group, CN = ISRG Root X1
verify return:1
depth=1 C = US, O = Let's Encrypt, CN = R3
verify return:1
depth=0 CN = ala.muras.eu
verify return:1
---
Certificate chain
0 s:CN = ala.muras.eu
i:C = US, O = Let's Encrypt, CN = R3
1 s:C = US, O = Let's Encrypt, CN = R3
i:C = US, O = Internet Security Research Group, CN = ISRG Root X1
2 s:C = US, O = Internet Security Research Group, CN = ISRG Root X1
i:O = Digital Signature Trust Co., CN = DST Root CA X3
---
Server certificate
-----BEGIN CERTIFICATE-----
MIIFZjCCBE6gAwIBAgISAytgxCG8Nfa5gbAkMQHXSwOMMA0GCSqGSIb3DQEBCwUA
MDIxCzAJBgNVBAYTAlVTMRYwFAYDVQQKEw1MZXQncyBFbmNyeXB0MQswCQYDVQQD
EwJSMzAeFw0yMjA0MDQxODA5MzFaFw0yMjA3MDMxODA5MzBaMBcxFTATBgNVBAMT
DGFsYS5tdXJhcy5ldTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBALg4
9WBf1tHJNysqDl6bTKj+8no8+QSV/xqxfpcgr9uIEUTYbJtHHNFHDi1QjaufaDBG
ryZsAUO5VfxHygPH93WQc4qX3ZQoaZ7+xA4QjGwR4zJw3CqdQNXXXfoW456iIHrz
EgzSf6KctnQg8VBGhnTqE0ZZN3QTHtLoRy2J/RcTl0z48SLBS60EpeOmIzjek5X1
mii+ZznEa3R+zat9bXxVxiwhFvxS+bhClEUrFYI5I5zPOs7ByUstc2c6Tws1wW2y
R4CEsuLcwvHSH6W7dN3CPjYZ5TbuYuprGxEgYSDJRN07bipy95R4BrHiKAk6R66a
UHlho4KwK7tnjs9VSdkCAwEAAaOCAo8wggKLMA4GA1UdDwEB/wQEAwIFoDAdBgNV
HSUEFjAUBggrBgEFBQcDAQYIKwYBBQUHAwIwDAYDVR0TAQH/BAIwADAdBgNVHQ4E
FgQUObZv+7j4EQHj5orKa2O1i0Yhd7UwHwYDVR0jBBgwFoAUFC6zF7dYVsuuUAlA
5h+vnYsUwsYwVQYIKwYBBQUHAQEESTBHMCEGCCsGAQUFBzABhhVodHRwOi8vcjMu
by5sZW5jci5vcmcwIgYIKwYBBQUHMAKGFmh0dHA6Ly9yMy5pLmxlbmNyLm9yZy8w
YAYDVR0RBFkwV4IMYWxhLm11cmFzLmV1ggxkb2MubXVyYXMuZXWCEG1pa29sYWou
bXVyYXMuZXWCD21vbmljYS5tdXJhcy5ldYIIbXVyYXMuZXWCDHd3dy5tdXJhcy5l
dTBMBgNVHSAERTBDMAgGBmeBDAECATA3BgsrBgEEAYLfEwEBATAoMCYGCCsGAQUF
BwIBFhpodHRwOi8vY3BzLmxldHNlbmNyeXB0Lm9yZzCCAQMGCisGAQQB1nkCBAIE
gfQEgfEA7wB1AEHIyrHfIkZKEMahOglCh15OMYsbA+vrS8do8JBilgb2AAABf/X7
deIAAAQDAEYwRAIgL+/+47ymSnPD786/vSsLAe9DnvdPSDhzB95iDJWRjBECIAYI
AwwP6sQhB852PAq2ImsgJC0UGrmr3BodVWjnRcMFAHYARqVV63X6kSAwtaKJafTz
fREsQXS+/Um4havy/HD+bUcAAAF/9ft2BgAABAMARzBFAiAYmpaYKA4Rklxe7KF2
3faQo5WQzwIQGMG/EBHsj55bWgIhAN/AyVz5PZ5x74R1otpwH+ULFcbyodU2TjrV
tmJMi1QSMA0GCSqGSIb3DQEBCwUAA4IBAQBTRMekA7B8D3EHvHPVFsjCePvWUX1D
sDTX/HJIAZ+L7szjQLZKHvDZRuoCceikZmGV4aFIdyt+jlEQneJVFj5QCEtjjjiI
j1eTEGSnotHXRAQeW1sjtGgSLWXrRJsLJNqzLfXw25/XJgSK/KIwuvh+KI32kaYl
+95nd1FHwZshNgttC8ihTFBQWijJVV6sOeyGE3JZHWBDQfjp7kbUvGxfLIi1ziWM
6ry0+FcICtVMWwLbQi4HMxax2PvTdCCQZCrOaWiM1xQ/p4k1p3iY7fyTdl9Sr6yr
Y+m6RPgVr/JEIKWGQtWCwtqk0TzrOUwIBIw5xU1HyA5hz7vOrzxeROSM
-----END CERTIFICATE-----
subject=CN = ala.muras.eu

issuer=C = US, O = Let's Encrypt, CN = R3

---
No client certificate CA names sent
Peer signing digest: SHA256
Peer signature type: RSA-PSS
Server Temp Key: X25519, 253 bits
---
SSL handshake has read 4642 bytes and written 380 bytes
Verification: OK
---
New, TLSv1.3, Cipher is TLS_AES_256_GCM_SHA384
Server public key is 2048 bit
Secure Renegotiation IS NOT supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
Early data was not sent
Verify return code: 0 (ok)
---
---
Post-Handshake New Session Ticket arrived:
SSL-Session:
Protocol  : TLSv1.3
Cipher    : TLS_AES_256_GCM_SHA384
Session-ID: 1A3E72C7B0EE867B11C105EFAE3C39CE4FB149B0EBCFD836792AA44161637204
Session-ID-ctx:
Resumption PSK: 5A4A7C751C76D7AED82A657151666E9AA15749C001A641871C2F70295EFCEDC9CFBE6FDDCD9C2EC151086FE17DE8222F
PSK identity: None
PSK identity hint: None
SRP username: None
TLS session ticket lifetime hint: 300 (seconds)
TLS session ticket:
0000 - 92 7a dd 24 74 a4 76 ba-76 7a 9f 79 3b 9c 35 bc   .z.$t.v.vz.y;.5.
0010 - ba 5f e8 bb 82 4a 84 47-7f 9d d0 0a f9 fa ab f3   ._...J.G........

    Start Time: 1652895551
    Timeout   : 7200 (sec)
    Verify return code: 0 (ok)
    Extended master secret: no
    Max Early Data: 0
---
read R BLOCK
---
Post-Handshake New Session Ticket arrived:
SSL-Session:
Protocol  : TLSv1.3
Cipher    : TLS_AES_256_GCM_SHA384
Session-ID: B90C92D0D4BF52C38E33C8C28210D43137E0642528C4F7FEEF9F77E06D5484CF
Session-ID-ctx:
Resumption PSK: 27AFDE9BBFFCCC660EC5C73053BCB7837A002ACBD7E7233397A834439B5514CDBDE8B4FDC2002970A61A2764262ABBAA
PSK identity: None
PSK identity hint: None
SRP username: None
TLS session ticket lifetime hint: 300 (seconds)
TLS session ticket:
0000 - 59 d0 96 33 41 f2 23 a7-45 87 d3 57 e5 eb 5f ba   Y..3A.#.E..W.._.
0010 - 58 18 2f 31 f6 28 ef 21-5e e6 e7 34 2f f0 43 72   X./1.(.!^..4/.Cr

    Start Time: 1652895551
    Timeout   : 7200 (sec)
    Verify return code: 0 (ok)
    Extended master secret: no
    Max Early Data: 0
---
read R BLOCK

Send a GET request to https://muras.eu and use muras.eu for SNI.

$ echo -e "GET / HTTP/1.1\r\nHost: muras.eu\r\nConnection: close\r\n\r\n" |  openssl s_client -quiet -connect muras.eu:443 -servername muras.eu
depth=2 C = US, O = Internet Security Research Group, CN = ISRG Root X1
verify return:1
depth=1 C = US, O = Let's Encrypt, CN = R3
verify return:1
depth=0 CN = ala.muras.eu
verify return:1
HTTP/1.1 200 OK
Date: Wed, 18 May 2022 17:29:56 GMT
Server: Apache/2.4.48 (Ubuntu)
Last-Modified: Tue, 08 Feb 2022 17:44:05 GMT
ETag: "11c3c-5d785436c2f8e"
Accept-Ranges: bytes
Content-Length: 72764
Vary: Accept-Encoding
Connection: close
Content-Type: text/html

<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Tomasz Muras</title>
    <meta name="description" content="Technical blog.">
...

Verify certificate chain:

# Store all certificates
$ openssl s_client -connect muras.eu:443 -showcerts </dev/null > cert.pem
# Extract them into cert1.pem  cert2.pem  cert3.pem
# Verify
$ openssl verify -CAfile cert2.pem cert1.pem
cert1.pem: OK

List of the error codes:

$ man verify
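
The verify step can be reproduced end to end with a throwaway CA, which is handy for experimenting without a live site; a sketch (all file names are local scratch files):

```shell
#!/bin/bash
set -e
cd "$(mktemp -d)"

# 1. A throwaway, self-signed CA
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -days 1 -subj "/CN=Demo CA" 2>/dev/null

# 2. A leaf key and certificate signing request
openssl req -newkey rsa:2048 -nodes -keyout leaf.key -out leaf.csr \
    -subj "/CN=leaf.example.org" 2>/dev/null

# 3. The CA signs the CSR
openssl x509 -req -in leaf.csr -CA ca.pem -CAkey ca.key \
    -CAcreateserial -out leaf.pem -days 1 2>/dev/null

# 4. Verify the chain, exactly as with cert1.pem / cert2.pem above
openssl verify -CAfile ca.pem leaf.pem
```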

Solving Identity Management

Design Questions

  • Who are your users: employees (B2E), consumers (B2C), or a business (B2B)?
  • How will users log in? Is there an existing account available to them that they would like to reuse?
  • Can your application be used anonymously or is authentication needed?
  • What kind of delivery - Web or native - does your application intend to provide?
  • Will your application need to call any APIs? If so, who owns the data that your application will retrieve?
  • How sensitive is the data that your application handles?
  • What access control requirements are needed?
  • How long should a user’s session last?
  • Is there more than one application in your system? If so, will users benefit from single sign-on? (Don’t forget a support forum!)
  • What should happen when users log out?
  • Are there any compliance requirements associated with this data?

Events in the Life of an Identity

  • Provisioning
  • Authorization
  • Authentication
  • Access Policy Enforcement
  • Sessions
  • Single Sign-On
  • Stronger Authentication
  • Logout
  • Account Management
  • Deprovisioning

Levels of Authorization and Access Policy Enforcement

  • Level 1 - Whether an entity can access an application or API at all
  • Level 2 - What functions an entity can use in an application or API
  • Level 3 - What data an entity can access or operate on
  • Account - A construct within a software application or service that usually contains or is associated with identity information, and optionally privileges, and which is used to access features within the application or service.
  • Identifier - A single identifying attribute that points to a unique individual user or entity, within a particular context.
  • Identity - A set of attributes, including one or more identifiers, associated with a specific user or entity, in a particular context.
  • Identity Repository - A collection of users stored in a computer storage system, such as a database or directory service.


Security Engineering

I definitely recommend the book to anyone interested in security.

Below are some quotes I’ve enjoyed.

Chapter “Psychology and Usability”

[…] people do logic much better if the problem is set in a social role. In the Wason test, subjects are told they have to inspect some cards with a letter grade on one side, and a numerical code on the other, and given a rule such as “If a student has a grade D on the front of their card, then the back must be marked with code 3”. They are shown four cards displaying (say) D, F, 3 and 7 and then asked “Which cards do you have to turn over to check that all cards are marked correctly?” Most subjects get this wrong; in the original experiment, only 48% of 96 subjects got the right answer of D and 7. However, the evolutionary psychologists Leda Cosmides and John Tooby found the same problem becomes easier if the rule is changed to “If a person is drinking beer, he must be 20 years old” and the individuals are a beer drinker, a coke drinker, a 25-year-old and a 16-year-old. Now three-quarters of subjects deduce that the bouncer should check the age of the beer drinker and the drink of the 16-year-old. Cosmides and Tooby argue that our ability to do logic and perhaps arithmetic evolved as a means of policing social exchanges.

Six main classes of techniques used to influence people and close a sale:

  1. Reciprocity: most people feel the need to return favours;
  2. Commitment and consistency: people suffer cognitive dissonance if they feel they’re being inconsistent;
  3. Social proof: most people want the approval of others. This means following others in a group of which they’re a member, and the smaller the group the stronger the pressure;
  4. Liking: most people want to do what a good-looking or otherwise likeable person asks;
  5. Authority: most people are deferential to authority figures;
  6. Scarcity: we’re afraid of missing out, if something we might want could suddenly be unavailable.

One chain of cheap hotels in France introduced self service. You’d turn up at that hotel, swipe your credit card in the reception machine, and get a receipt with a numerical access code to unlock your room door. To keep costs down, the rooms did not have en-suite bathrooms. A common failure mode was that you’d get up in the middle of the night to go to the bathroom, forget your access code, and realise you hadn’t taken the receipt with you. So you’d have to sleep on the bathroom floor until the staff arrived the following morning.

Chapter “Economics”

[…] many security failures weren’t due to technical errors so much as to wrong incentives: if the people who guard a system are not the people who suffer when it fails, then you can expect trouble.

[…] suppose that there are 100 used cars for sale in a town: 50 well maintained cars worth $2000 each, and 50 “lemons” worth $1000. The sellers know which is which, but the buyers don’t. What is the market price of a used car? You might think $1500; but at that price, no good cars will be offered for sale. So the market price will be close to $1000. This is why, if you buy a new car, maybe 20% falls off the price the second you drive it out of the dealer’s lot. […] When users can’t tell good from bad, they might as well buy the cheapest.

Chapter “Multilevel Security”

Typical corporate policy language:

  1. This policy is approved by Management.
  2. All staff shall obey this security policy.
  3. Data shall be available only to those with a “need-to-know”.
  4. All breaches of this policy shall be reported at once to Security.

Chapter “Banking and Bookkeeping”

[…] it’s inevitable that your top engineers will be so much more knowledgeable than your auditors that they could do bad things if they really wanted to.

The big audit firms have a pernicious effect on the information security world by pushing their own list of favourite controls, regardless of the client’s real risks. They maximise their income by nit-picking and compliance; the Sarbanes-Oxley regulations cost the average US public company over $1m a year in audit fees.

The banks’ response was intrusion detection systems that tried to identify criminal businesses by correlating the purchase histories of customers who complained. By the late 1990s, the smarter crooked businesses learned to absorb the cost of the customer’s transaction. You have a drink at a Mafia-owned bistro, offer a card, sign the voucher, and fail to notice when the charge doesn’t appear on your bill. A month or two later, there’s a huge bill for jewelry, electrical goods or even casino chips. By then you’ve forgotten about the bistro, and the bank never had a record of it.

Chapter “Tamper Resistance”

[…] NIST Dual-EC-DRBG, which was built into Windows and seemed to contain an NSA trapdoor; Ed Snowden later confirmed that the NSA paid RSA $10m to use this standard in tools that many tech companies licensed.

One national-security concern is that as defence systems increasingly depend on chips fabricated overseas, the fabs might introduce extra circuitry to facilitate later attack. For example, some extra logic might cause a 64-bit multiply with two specific inputs to function as a kill switch.

Chapter “Side Channels”

Another example is that a laser pulse can create a click on a microphone, so a voice command can be given to a home assistant through a window.

Chapter “Phones”

Countries that import their telephone exchanges rather than building their own just have to assume that their telephone switchgear has vulnerabilities known to the supplier’s government. (During the invasion of Afghanistan in 2001, Kabul had two exchanges: an old electromechanical one and a new electronic one. The USAF bombed only the first.)

Chapter “Electronic and Information Warfare”

Traffic analysis - looking at the number of messages by source and destination - can also give very valuable information. Imminent attacks were signalled in World War 1 by a greatly increased volume of radio messages, and more recently by increased pizza deliveries to the Pentagon.

[…] meteor burst transmission (also known as meteor scatter). This relies on the billions of micrometeorites that strike the Earth’s atmosphere each day, each leaving a long ionization trail that persists for typically a third of a second and provides a temporary transmission path between a mother station and an area of maybe a hundred miles long and a few miles wide. The mother station transmits continuously; whenever one of the daughters is within such an area, it hears mother and starts to send packets of data at high speed, to which mother replies. With the low power levels used in covert operations one can achieve an average data rate of about 50 bps, with an average latency of about 5 minutes and a range of 500 - 1500 miles. Meteor burst communications are used by special forces, and civilian applications such as monitoring rainfall in remote parts of the third world.

[…] the United States was deploying “neutron bombs” in Europe - enhanced radiation weapons that could kill people without demolishing buildings. The Soviets portrayed this as a “capitalist bomb” that would destroy people while leaving property intact, and responded by threatening a “socialist bomb” to destroy property (in the form of electronics) while leaving the surrounding people intact.

A certain level of sharing was good for business. People who got a pirate copy of a tool and liked it would often buy a regular copy, or persuade their employer to buy one. In 1998 Bill Gates even said, “Although about three million computers get sold every year in China, people don’t pay for the software. Someday they will, though. And as long as they’re going to steal it, we want them to steal ours. They’ll get sort of addicted, and then we’ll somehow figure out how to collect sometime in the next decade”

[…] one cable TV broadcast a special offer for a free T-shirt, and stopped legitimate viewers from seeing the 0800 number to call; this got them a list of the pirates’ customers.

Early in the lockdown, some hospitals didn’t have enough batteries for the respirators used by their intensive-care clinicians, now that they were being used 24x7 rather than occasionally. The market-leading 3M respirators and the batteries that powered them had authentication chips, so the company could sell batteries for over $200 that cost $5 to make. Hospitals would happily have bought more for $200, but China had nationalised the factory the previous month, and 3M wouldn’t release the keys to other component suppliers.

Chapter “Surveillance or Privacy?”

How could the banking industry’s thirst for a respectable cipher be slaked, not just in the USA but overseas, without this cipher being adopted by foreign governments and driving up the costs of intelligence collection? The solution was the Data Encryption Standard (DES). At the time, there was controversy about whether 56 bits were enough. We know now that this was deliberate. The NSA did not at the time have the machinery to do DES keysearch; that came later. But by giving the impression that they did, they managed to stop most foreign governments adopting it. The rotor machines continued in service, in many cases reimplemented using microcontrollers […] the traffic continued to be harvested. Foreigners who encrypted their important data with such ciphers merely marked that traffic as worth collecting.

Most of the Americans who died as a result of 9/11 probably did so since then in car crashes, after deciding to drive rather than fly: the shift from flying to driving led to about 1,000 extra fatalities in the following three months alone, and about 500 a year since then.

So a national leader trying to keep a country together following an attack should constantly remind people what they’re fighting for. This is what the best leaders do, from Churchill’s radio broadcasts to Roosevelt’s fireside chats.

Chapter “Secure Systems Development”

IBM separated the roles of system analyst, programmer and tester; the analyst spoke to the customer and produced a design, which the programmer coded, and then the tester looked for bugs in the code. The incentives weren’t quite right, as the programmer could throw lots of buggy code over the fence and hope that someone else would fix it. This was slow and led to bloated code. Microsoft abolished the distinction between analysts, programmers and testers; it had only developers, who spoke to the customer and were also responsible for fixing their own bugs. This held up the bad programmers who wrote lots of bugs, so that more of the code was produced by the more skillful and careful developers. According to Steve Maguire, this is what enabled Microsoft to win the battle to rule the world of 32-bit operating systems.

Bezos’ law says you can’t run a dev project with more people than can be fed from two pizzas.

Another factor in team building is the adoption of a standard style. One signal of poorly-managed teams is that the codebase is in a chaotic mixture of styles, with everybody doing their own thing. When a programmer checks out some code to work on it, they may spend half an hour formatting it and tweaking it into their style. Apart from the wasted time, reformatted code can trip up your analysis tools.

Chapter “Assurance and Sustainability”

When you really want a protection property to hold, it’s vital that the design and implementation be subjected to hostile review. It will be eventually, and it’s likely to be cheaper if it’s done before the system is fielded. As we’ve seen in one case history after another, the motivation of the attacker is critical; friendly reviews, by people who want the system to pass, are essentially useless compared with contributions by people who are seriously trying to break it.