Create a new service with the MemoryHigh and MemoryMax directives.
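A minimal sketch of such a unit; the unit name, paths and limit values here are my assumptions, not necessarily the originals:

    # /etc/systemd/system/memory.service
    [Unit]
    Description=MemoryHigh/MemoryMax demo

    [Service]
    ExecStart=/root/memory.sh
    # Above MemoryHigh, processes are throttled and memory is reclaimed aggressively.
    MemoryHigh=500M
    # MemoryMax is a hard limit; exceeding it triggers the OOM killer.
    MemoryMax=1G

    [Install]
    WantedBy=multi-user.target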
The content of /root/memory.sh:
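The original script isn't preserved in these notes; any memory hog works for this demo, for example:

    #!/bin/bash
    # tail has to buffer the endless, newline-free stream from /dev/zero
    # in RAM, so memory usage grows steadily until a limit kicks in.
    tail /dev/zero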
Start the service:
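Assuming the unit file was saved as memory.service:

    systemctl daemon-reload
    systemctl start memory.service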
After a while the script crosses MemoryMax and the service gets killed; check its status:
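    # the unit name is the one assumed above
    systemctl status memory.service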
And in the dmesg:
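The kernel's OOM-kill record can be pulled out with, for example:

    dmesg | grep -i oom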
The simplest possible service that runs the /root/process.sh script:
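Something like this is enough (the unit name is my assumption):

    # /etc/systemd/system/process.service
    [Service]
    ExecStart=/root/process.sh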
The content of /root/process.sh:
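The original script isn't preserved in these notes; for a TasksMax demo, any script that keeps spawning processes will do, e.g.:

    #!/bin/bash
    # Spawn a new long-lived background process every second,
    # so the number of tasks in the unit's cgroup keeps growing.
    while true; do
        sleep 1000 &
        sleep 1
    done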
Start the service:
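Again, assuming the unit file was saved as process.service:

    systemctl daemon-reload
    systemctl start process.service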
Now add TasksMax=5 into the [Service] section:
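So the [Service] section becomes:

    [Service]
    ExecStart=/root/process.sh
    # Hard limit on the number of tasks (processes and threads) in the unit's cgroup.
    TasksMax=5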
and restart the service:
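    systemctl daemon-reload
    systemctl restart process.service

Once the limit is reached, further forks inside the unit fail; systemctl status shows the current task count against the limit.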
An interesting article from IEEE about Blockchain’s Carbon and Environmental Footprints.
According to https://digiconomist.net:
Roles defined in OAuth 2.0 (per RFC 6749):
- Resource owner - the user who owns the protected data.
- Client - the application requesting access on the user’s behalf.
- Authorization server - authenticates the resource owner and issues access tokens.
- Resource server - hosts the protected resources and accepts access tokens.
Which flow to use:
Do not use Implicit flow or Resource Owner Password Credentials flow at all.
Strategies that can be used to limit access for a specific access token:
To verify an access token:
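One option, if the authorization server exposes a token introspection endpoint (RFC 7662), is to ask the server directly; the URL and client credentials below are placeholders:

    # POST the token to the introspection endpoint, authenticating as the client.
    # The "active" field of the JSON response says whether the token is valid.
    curl -u my-client:my-secret \
         -d "token=$ACCESS_TOKEN" \
         https://auth.example.com/oauth2/introspect

For self-contained JWT access tokens, the signature, issuer, audience and expiry can instead be validated locally against the server’s published keys.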
Internal applications (first-party) are those owned by the enterprise. They may be self-hosted or SaaS. No need for the user’s consent.
External applications (third-party) should require the user’s consent.
Possible architectures of an application we are securing:
Securing native and mobile applications. How to return authorization code to the application:
Example authorization strategies:
Keycloak can act as a centralized authorization service through a functionality called Authorization Services.
For a production setup:
Keycloak uses Apache FreeMarker for templates.
The example below is based on Moodle, but similar steps apply to any OpenID Connect (OIDC) Relying Party. The Moodle URL used here as an example is https://test.pycert.org; replace it with your installation’s URL.
Log in to GitHub and register a new OAuth app (Settings -> Developer settings -> OAuth Apps).
From the next page note the Client ID and the Client Secret.
Go to Moodle and log in as admin. Under Site administration -> Authentication -> Manage authentication (https://test.pycert.org/admin/settings.php?section=manageauths) enable OAuth2.
Go to Site administration -> Server -> OAuth 2 services and click on “Custom”.
After submitting, on the admin/tool/oauth2/issuers.php page click the “Configure endpoints” icon. Add the following endpoints (Name, URL); for GitHub these are:
- authorization_endpoint: https://github.com/login/oauth/authorize
- token_endpoint: https://github.com/login/oauth/access_token
- userinfo_endpoint: https://api.github.com/user
Log out (or use another browser, e.g. an incognito window) and try to log in to your Moodle site using a GitHub account.
If you receive an error:
The user information returned did not contain a username and email address. The OAuth 2 service may be configured incorrectly.
It means that your email addresses are kept private in your GitHub settings (the “Keep my email addresses private” setting). The email is not passed from GitHub to Moodle during the OAuth workflow, and since Moodle requires at least a username and an email address when creating the account, it fails. One way to fix it is to disable “Keep my email addresses private” in GitHub.
I ended up patching core Moodle and adding an extra call to https://api.github.com/user/emails to retrieve an email.
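For reference, that endpoint is just an authenticated GET; with a user token that has the user:email scope:

    # Returns the authenticated user's email addresses, including private ones.
    curl -H "Authorization: Bearer $ACCESS_TOKEN" \
         https://api.github.com/user/emails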
How to list TCP/IP connections per remote IP, ordered by count:
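A classic one-liner that does it (IPv4, net-tools netstat; the remote address is column 5):

    # Take the remote address column, strip the port, count occurrences per IP.
    netstat -tn | awk 'NR>2 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn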
Display certificate information:
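For a local PEM file (the file name is a placeholder):

    # Print the human-readable text form of the certificate.
    openssl x509 -in certificate.pem -noout -text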
Send GET request to https://muras.eu and use muras.eu for SNI.
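A sketch using openssl s_client, where -servername sets the SNI value:

    # Open a TLS connection with SNI muras.eu and issue a plain GET over it.
    printf 'GET / HTTP/1.1\r\nHost: muras.eu\r\nConnection: close\r\n\r\n' | \
        openssl s_client -quiet -connect muras.eu:443 -servername muras.eu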
Verify certificate chain:
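For example, with the intermediate(s) and the root CA in separate files (file names are placeholders):

    # Intermediates go in via -untrusted, the trust anchor via -CAfile.
    openssl verify -untrusted intermediate.pem -CAfile root.pem server.pem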
List of the error codes:
Cheatsheet for ranger - a console file manager with VI key bindings - based on the one by heroheman.
I definitely recommend the book to anyone interested in security.
Below are some quotes I’ve enjoyed.
[…] people do logic much better if the problem is set in a social role. In the Wason test, subjects are told they have to inspect some cards with a letter grade on one side, and a numerical code on the other, and given a rule such as “If a student has a grade D on the front of their card, then the back must be marked with code 3”. They are shown four cards displaying (say) D, F, 3 and 7 and then asked “Which cards do you have to turn over to check that all cards are marked correctly?” Most subjects get this wrong; in the original experiment, only 48% of 96 subjects got the right answer of D and 7. However the evolutionary psychologists Leda Cosmides and John Tooby found that the same problem becomes easier if the rule is changed to “If a person is drinking beer, he must be 20 years old” and the individuals are a beer drinker, a coke drinker, a 25-year-old and a 16-year-old. Now three-quarters of subjects deduce that the bouncer should check the age of the beer drinker and the drink of the 16-year-old. Cosmides and Tooby argue that our ability to do logic and perhaps arithmetic evolved as a means of policing social exchanges.
Six main classes of techniques used to influence people and close a sale:
One chain of cheap hotels in France introduced self-service. You’d turn up at the hotel, swipe your credit card in the reception machine, and get a receipt with a numerical access code to unlock your room door. To keep costs down, the rooms did not have en-suite bathrooms. A common failure mode was that you’d get up in the middle of the night to go to the bathroom, forget your access code, and realise you hadn’t taken the receipt with you. So you’d have to sleep on the bathroom floor until the staff arrived the following morning.
[…] many security failures weren’t due to technical errors so much as to wrong incentives: if the people who guard a system are not the people who suffer when it fails, then you can expect trouble.
[…] suppose that there are 100 used cars for sale in a town: 50 well maintained cars worth $2000 each, and 50 “lemons” worth $1000. The sellers know which is which, but the buyers don’t. What is the market price of a used car? You might think $1500; but at that price, no good cars will be offered for sale. So the market price will be close to $1000. This is why, if you buy a new car, maybe 20% falls off the price the second you drive it out of the dealer’s lot. […] When users can’t tell good from bad, they might as well buy the cheapest.
Typical corporate policy language:
[…] it’s inevitable that your top engineers will be so much more knowledgeable than your auditors that they could do bad things if they really wanted to.
The big audit firms have a pernicious effect on the information security world by pushing their own list of favourite controls, regardless of the client’s real risks. They maximise their income by nit-picking and compliance; the Sarbanes-Oxley regulations cost the average US public company over $1m a year in audit fees.
The banks’ response was intrusion detection systems that tried to identify criminal businesses by correlating the purchase histories of customers who complained. By the late 1990s, the smarter crooked businesses learned to absorb the cost of the customer’s transaction. You have a drink at a Mafia-owned bistro, offer a card, sign the voucher, and fail to notice when the charge doesn’t appear on your bill. A month or two later, there’s a huge bill for jewelry, electrical goods or even casino chips. By then you’ve forgotten about the bistro, and the bank never had a record of it.
[…] NIST Dual-EC-DRBG, which was built into Windows and seemed to contain an NSA trapdoor; Ed Snowden later confirmed that the NSA paid RSA $10m to use this standard in tools that many tech companies licensed.
One national-security concern is that as defence systems increasingly depend on chips fabricated overseas, the fabs might introduce extra circuitry to facilitate later attack. For example, some extra logic might cause a 64-bit multiply with two specific inputs to function as a kill switch.
Another example is that a laser pulse can create a click on a microphone, so a voice command can be given to a home assistant through a window.
Countries that import their telephone exchanges rather than building their own just have to assume that their telephone switchgear has vulnerabilities known to the supplier’s government. (During the invasion of Afghanistan in 2001, Kabul had two exchanges: an old electromechanical one and a new electronic one. The USAF bombed only the first.)
Traffic analysis - looking at the number of messages by source and destination - can also give very valuable information. Imminent attacks were signalled in World War 1 by greatly increased volume of radio messages, and more recently by increased pizza deliveries to the Pentagon.
[…] meteor burst transmission (also known as meteor scatter). This relies on the billions of micrometeorites that strike the Earth’s atmosphere each day, each leaving a long ionization trail that persists for typically a third of a second and provides a temporary transmission path between a mother station and an area maybe a hundred miles long and a few miles wide. The mother station transmits continuously; whenever one of the daughters is within such an area, it hears mother and starts to send packets of data at high speed, to which mother replies. With the low power levels used in covert operations one can achieve an average data rate of about 50 bps, with an average latency of about 5 minutes and a range of 500 - 1500 miles. Meteor burst communications are used by special forces, and in civilian applications such as monitoring rainfall in remote parts of the third world.
[…] the United States was deploying “neutron bombs” in Europe - enhanced radiation weapons that could kill people without demolishing buildings. The Soviets portrayed this as a “capitalist bomb” that would destroy people while leaving property intact, and responded by threatening a “socialist bomb” to destroy property (in the form of electronics) while leaving the surrounding people intact.
A certain level of sharing was good for business. People who got a pirate copy of a tool and liked it would often buy a regular copy, or persuade their employer to buy one. In 1998 Bill Gates even said, “Although about three million computers get sold every year in China, people don’t pay for the software. Someday they will, though. And as long as they’re going to steal it, we want them to steal ours. They’ll get sort of addicted, and then we’ll somehow figure out how to collect sometime in the next decade”.
[…] one cable TV operator broadcast a special offer for a free T-shirt, and stopped legitimate viewers from seeing the 0800 number to call; this got them a list of the pirates’ customers.
Early in the lockdown, some hospitals didn’t have enough batteries for the respirators used by their intensive-care clinicians, now that they were being used 24x7 rather than occasionally. The market-leading 3M respirators and the batteries that powered them had authentication chips, so the company could sell batteries for over $200 that cost $5 to make. Hospitals would happily have bought more for $200, but China had nationalised the factory the previous month, and 3M wouldn’t release the keys to other component suppliers.
How could the banking industry’s thirst for a respectable cipher be slaked, not just in the USA but overseas, without this cipher being adopted by foreign governments and driving up the costs of intelligence collection? The solution was the Data Encryption Standard (DES). At the time, there was controversy about whether 56 bits were enough. We know now that this was deliberate. The NSA did not at the time have the machinery to do DES keysearch; that came later. But by giving the impression that they did, they managed to stop most foreign governments adopting it. The rotor machines continued in service, in many cases reimplemented using microcontrollers […] the traffic continued to be harvested. Foreigners who encrypted their important data with such ciphers merely marked that traffic as worth collecting.
Most of the Americans who died as a result of 9/11 probably did so since then in car crashes, after deciding to drive rather than fly: the shift from flying to driving led to about 1,000 extra fatalities in the following three months alone, and about 500 a year since then.
So a national leader trying to keep a country together following an attack should constantly remind people what they’re fighting for. This is what the best leaders do, from Churchill’s radio broadcasts to Roosevelt’s fireside chats.
IBM separated the roles of system analyst, programmer and tester: the analyst spoke to the customer and produced a design, which the programmer coded, and then the tester looked for bugs in the code. The incentives weren’t quite right, as the programmer could throw lots of buggy code over the fence and hope that someone else would fix it. This was slow and led to bloated code. Microsoft abolished the distinction between analysts, programmers and testers; it had only developers, who spoke to the customer and were also responsible for fixing their own bugs. This held up the bad programmers who wrote lots of bugs, so that more of the code was produced by the more skillful and careful developers. According to Steve Maguire, this is what enabled Microsoft to win the battle to rule the world of 32-bit operating systems.
Bezos’ law says you can’t run a dev project with more people than can be fed from two pizzas.
Another factor in team building is the adoption of a standard style. One signal of poorly-managed teams is that the codebase is in a chaotic mixture of styles, with everybody doing their own thing. When a programmer checks out some code to work on it, they may spend half an hour formatting it and tweaking it into their style. Apart from the wasted time, reformatted code can trip up your analysis tools.
When you really want a protection property to hold, it’s vital that the design and implementation be subjected to hostile review. It will be eventually, and it’s likely to be cheaper if it’s done before the system is fielded. As we’ve seen in one case history after another, the motivation of the attacker is critical; friendly reviews, by people who want the system to pass, are essentially useless compared with contributions by people who are seriously trying to break it.