From 0da94ed5d0201e0a773a9d54edea8fefd8555c4c Mon Sep 17 00:00:00 2001
From: leo
Date: Thu, 25 May 2023 04:08:10 +0200
Subject: [PATCH] tex: edit stuff in complexity

---
 tex/text.tex | 25 +++++++++++++------------
 1 file changed, 13 insertions(+), 12 deletions(-)

diff --git a/tex/text.tex b/tex/text.tex
index 67bd8aa..c572341 100644
--- a/tex/text.tex
+++ b/tex/text.tex
@@ -336,20 +336,21 @@ internet is no easy task.
 
 \n{3}{Complexity}
 
 Browsers these days are also quite ubiquitous programs running on
-\emph{billions} of consumer-grade mobile devices (which are also notorious for
+\emph{billions} of consumer grade mobile devices (which are also notorious for
 bad update hygiene) or desktop devices all over the world. Regular users
 usually expect them to work flawlessly with a multitude of network conditions,
-network scenarios (café WiFi, cellular data in a remote location, home
-broadband that is DNS-poisoned by the ISP), differently tuned (or commonly
-misconfigured) web servers, a combination of modern and \emph{legacy}
-encryption schemes and different levels of conformance to web standards from
-both web server and website developers. Of course, if a website is broken, it
-is the browser's fault. Browsers are expected to detect if \emph{captive
-portals} (a type of access control that usually tries to force the user through
-a webpage with terms of use) are active and offer redirects. All of this is
-immense complexity and the combination of ubiquity and great exposure this type
-of software gets is in the authors opinion the cause behind a staggering amount
-of vulnerabilities found, reported and fixed in browsers every year.
+network scenarios (the proverbial café WiFi, cellular data in a remote
+location, home broadband that is DNS-poisoned by the ISP), differently tuned
+(or commonly misconfigured) web servers, a combination of modern and
+\emph{legacy} encryption schemes and different levels of conformance to web
+standards from both web server and website developers. Of course, if a website
+is broken, it is the browser's fault. Browsers are expected to detect if
+\emph{captive portals} (a type of access control that usually tries to force
+the user through a webpage with terms of use) are active and offer redirects.
+All of this is immense complexity and the combination of ubiquity and great
+exposure this type of software gets is, in the author's opinion, the cause
+behind a staggering amount of vulnerabilities found, reported and fixed in
+browsers every year.
 
 \n{3}{Standardisation}
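The captive-portal detection the patched paragraph mentions is commonly implemented as a connectivity probe: the browser fetches a well-known URL over plain HTTP and checks whether the expected reply comes back, treating a redirect (the portal hijacking the request towards its login/terms page) as the portal signal. The sketch below is an illustration of that general technique, not the thesis's own implementation; the probe URL is a hypothetical placeholder, and the "204 No Content means online" convention is an assumption borrowed from common practice.

```python
import urllib.request
import urllib.error

# Hypothetical probe endpoint -- real browsers ship vendor-specific URLs
# for this purpose; this address is a placeholder for illustration only.
PROBE_URL = "http://connectivity.example.net/probe"
EXPECTED_STATUS = 204  # "No Content": the conventional "you are online" reply


def classify_probe(status: int) -> str:
    """Map the probe's HTTP status to a connectivity verdict.

    A captive portal usually hijacks the probe: it either redirects (3xx)
    to its login/terms-of-use page, or answers with an unexpected page
    itself (simplified here to "any other non-204 status").
    """
    if status == EXPECTED_STATUS:
        return "online"
    if 300 <= status < 400:
        return "captive-portal"
    return "intercepted"


class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError for 3xx responses
    # instead of silently following the portal's redirect.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def probe(url: str = PROBE_URL) -> str:
    """Run the connectivity probe and classify the result."""
    opener = urllib.request.build_opener(_NoRedirect())
    try:
        with opener.open(url, timeout=5) as resp:
            return classify_probe(resp.status)
    except urllib.error.HTTPError as err:
        return classify_probe(err.code)
    except OSError:
        return "offline"  # no connectivity at all, not a portal
```

Separating the pure decision logic (`classify_probe`) from the network I/O keeps the interception heuristic testable without a live network; a browser would additionally verify the response body, since some portals return a spoofed 204.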