I highly recommend working through Claude Shannon's "A Mathematical Theory of Communication" [0]. It was originally a paper and was later restructured as a book; it works quite well in either form.
The reason I recommend it is that it shows mathematical reasoning that is easy to follow and relevant to your daily life. It's real math, but very easy to read through and understand. If you're unfamiliar with it, this paper is where the very idea of "bits" comes from.
One of the most important things in the paper for non-mathematicians to see is that the definition of information entropy is derived simply from the mathematical properties Shannon wants it to have.
This is important because I find that one of the biggest questions people ask about a mathematical formula or idea is "What does this mean? Why is it this way?", without realizing that math is really neither engineering nor physics. When deriving his definition of information, Shannon simply states that a measure of information should satisfy a short list of properties, and then goes on to show that the now-standard definition meets all of those criteria.
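(Concretely, the properties Shannon asks for are, roughly: continuity in the probabilities, monotonic growth with the number of equally likely outcomes, and additivity when a choice is decomposed into successive choices. He shows the essentially unique measure satisfying them is H = -sum(p_i log p_i), and with base-2 logarithms the unit is the bit. As a minimal sketch, purely for illustration, in Python:

    from math import log2

    def entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)),
        # with the usual convention that 0 * log2(0) = 0.
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # 1.0 -- a fair coin toss carries exactly one bit
    print(entropy([0.9, 0.1]))  # ~0.47 -- a biased coin is more predictable, hence less informative
    print(entropy([1.0]))       # 0.0 -- a certain outcome carries no information

None of this code is from the paper itself, of course; it just evaluates the formula the paper derives.)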
In mathematics it is very often the case that only after an idea is created do we start realizing its applications. This is quite different from science, where a model is only adopted if it correctly describes a physical process.
Work through the paper and you will have worked through the mathematical underpinnings of the information age, and you will likely have understood most of it pretty well.
0. https://people.math.harvard.edu/~ctm/home/text/others/shanno...