NOTE: THESE ARE JUST NOTES, PROBABLY FULL OF ERRORS!!
if you see an error, please email me to let me know, thanks
"A priori" knowledge is that which we can be sure of without any empirical evidence, where empirical evidence means evidence from our senses. So, "a priori" knowledge is stuff that you could figure out if you were just a mind floating in a void unconnected to a body. "A posteriori" or "empirical" knowledge is stuff that you would need empirical evidence for to believe.
In many fields, some people spend some time working on "theory", and consider this separate from "applied" work or "experimental" work. There are various definitions of "theory", but one is that you are working on theory when your goal is to understand something. This is as opposed to application, where your goal is to solve some problem, to produce something, or to produce a method for doing something; and as opposed to experiment, where your goal is to collect empirical data.
If theory is defined as trying to understand something, this explains some of the curious properties of theoretical work. When you work on theory, you don't know what you are going to discover, whether you will discover anything at all, or how long it will take, because in order to predict these things you'd have to understand the domain that you are working on; but if you already understood it, you wouldn't have to do theory in the first place.
(thanks to R.F. for this idea)
Another aspect of "theory" is its connection with a priori reasoning. If a scientist or engineer says they are doing work in theory, generally they mean that they are spending much of their time deriving a set of logical consequences from assumptions. The assumptions might not be a priori, but given the assumptions, the consequences are. Either or both of the choice of assumptions, or the results sought from derivation, are usually dependent on empirical evidence or contingent circumstances, however. So a scientist might work on theory for awhile and then say, "From experimental evidence, we see that X and Y and Z. From a priori reasoning, I can show that if W and X and Y are true, then Z is implied; and further, Q is implied. Neither W and Q have yet been tested experimentally, but both could be". The fact that Z and Q are implied by W and X and Y might be surprising or at least non-obvious, and the derivation of Z from X and Y might take a lot of work, which is why this is an activity in itself.
I don't mean to say that, in producing such a theory, a theorist would actually start by picking out beforehand a W and an X and a Y and a Z and then try to find a derivation of Z from W and X and Y. This sometimes happens, but often a theorist will just start making random, somewhat plausible assumptions and playing around with them formally to see what turns up, staying attentive both to things that are already empirically known facts and to testable hypotheses. After they find some interesting-seeming structure, they might try to change things around until they get some assumptions and/or conclusions that are more firmly grounded in experiment, and/or that can be feasibly tested. Let me repeat that for emphasis; the theorists that i know do not think carefully about exactly what they want to prove from exactly which assumptions and then try really hard to prove it; they start by (1) observing unexplained but quantified (or at least formally phrased) regularities in data, and (2) haphazardly making up, and just as quickly discarding, plausible formal assumptions and fooling around with them to see if derivations starting from those have any interesting structure. Only after they discover a promising start do they go back, think about exactly what the assumptions and conclusions should be, and work hard at adjusting their derivations to find a proof for those.
Note that there is another, more concrete definition of 'a theory' used in mathematical logic (namely, the set of consequences of a given set of axioms).
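To make that concrete (this is standard logician's notation, not anything specific to these notes): if Sigma is a set of axioms, then the 'theory of Sigma' in this sense is the set of all sentences provable from Sigma, which in LaTeX notation is

    \mathrm{Th}(\Sigma) = \{ \varphi : \Sigma \vdash \varphi \}

where \vdash is the provability relation of whatever logic is in use (some authors instead use the semantic consequence relation \models).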
There are various ways of defining fields of study.
One way is by topic. Chemistry studies chemicals, biology studies living things, physics studies the laws underlying the physical world, math studies numbers, literature studies texts, history studies things that happened in the past involving humans, etc.
Another is by discipline, that is, what methods each field denies itself when producing knowledge. Math does not allow anything which is not proven. The sciences do not allow anything without empirical evidence. The experimental sciences do not allow anything which has not been confirmed by experiment. Etc. (An interesting but not essential essay relating to this topic is "Knowledge Become Self-conscious" by Mortimer J. Adler [1], which was the essay introduction to Part 10 of [2].)
Another is by community. Chemistry is the community of researchers in chemistry departments, their students, etc.
The terms field, topic, and discipline are used interchangeably, but this sometimes leads to arguments. For example, "interdisciplinary" topics seem like a good thing from the "topic" and "community" perspectives (there are topics which are 'in between' the well-known fields, so if you want to study them you end up working with people employed in multiple fields), but seem like a bad thing from the 'discipline' perspective (imagine for example an interdisciplinary branch of mathematical physics which, like physicists, accepts as results derivations which are not sufficiently rigorous to be mathematical proofs, and which, like mathematicians, accepts as results "not even wrong" theories consisting of interesting arrangements of formal machinery which cannot make empirical predictions distinct from preexisting theories. Such a community can be said to have no discipline!).
One type of topic is logic. Logic is the investigation of universally certain, formal reasoning. Here 'formal reasoning' means that we reason only on the basis of the form (e.g. the grammatical form) of statements, not of their content; 'certain' means that, insofar as we are reasoning logically, we only want to use methods such that, if we could follow the method without error, then we could be absolutely certain of our conclusions, as opposed to merely 'beyond a reasonable doubt'; and 'universally certain' means that in any situation in which a reasoning being could possibly exist, the methods of logic would still yield certain results.
For example, the statement "if A is true, and if it is true that A implies B, then B must be true" is universally certain and formal. Furthermore, the statement says that a certain method of reasoning, namely, to conclude that B is true if you already know that A is true and that A implies B, is valid. So, this statement is both logically true itself, and also of particular interest to logic because it asserts the validity of a reasoning method.
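As a side illustration (mine, not from any source cited here): this reasoning method, usually called modus ponens, is simple enough to be machine-checked. Here is a minimal sketch assuming the Lean proof assistant as the formal system, with hA and hAB as made-up names for the two assumptions:

    -- modus ponens: from a proof hA of A and a proof hAB of "A implies B",
    -- applying hAB to hA produces a proof of B
    example (A B : Prop) (hA : A) (hAB : A → B) : B := hAB hA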
If we expand our view from universally certain, formal reasoning to all universally certain, formal truth, we get the topic of rigorous mathematics, construed as truth about which one can write formal proofs. This is more a discipline than a topic; the discipline is that you may only use logical proofs.
If we expand our view to all universally certain truth, we get rigorous philosophy. At this point we admit patterns of reasoning such as "I think, therefore I am," whose universally certain validity seems to stem from its content rather than from its form.
Now we expand to universal truth. This covers much of the wide-ranging subject of philosophy. However there are parts of philosophy which are concerned with issues particular to humans, rather than applicable to any reasoning being.
Next we'll turn our attention to the experimental sciences. Relax universality to things which merely seem to be true everywhere in existence, but which seem to be contingently or empirically true, as opposed to necessarily true. By empirically we mean that if you were a mind floating around without a body and without any senses, you would not be able to convince yourself of these things; more precisely, you need empirical evidence from your senses in order to decide if these things seem to be true. We will also add the requirement of experimental science: that we only consider falsifiable hypotheses, that is, hypotheses for which some experiment can be imagined which could in principle falsify the hypothesis.
The study of falsifiable hypotheses for truths which are thought to hold throughout existence is physics.
Part of physics is chemistry.
Biology has some content that is applicable anywhere in this universe (for example, the theory of evolution), but for the most part it studies truths which are only applicable to specific types of life out of all of the theoretically possible types, namely those types which are known to exist or to have existed; and since we've only explored close to home, right now that restricts us to this solar system, and as of this writing, even to this planet.
Neuroscience is part of biology.
Geology is also for the most part confined to this planet.
At this point we turn to non-experimental sciences, that is, topics for which experiments cannot feasibly be done.
Economics is a mixture of mathematics with a science that is experimental in principle but largely non-experimental in practice (since no one really has the time, the money, or the ethical license to do most of the experiments).
There are also 'hard sciences' and 'soft sciences'. Hard sciences are (my definition, not everyone thinks of it this way) those in which the objects of study can in principle be directly measured in a non-controversial fashion. Soft sciences are the others, those in which there is considerable disagreement on just what the objects of study really are and how to assess them. For example, in geology they study rocks. This is a hard science. You can weigh a rock, you can do chemistry on it to see what it's made of. It's clear what rocks are (a collection of molecules stuck together) and how they could be broken into well-defined parts (molecules).
By contrast, in psychology they study people's minds. People still argue about what a mind is. There are various things that are thought to be parts of a mind (thoughts, emotions, perceptions, memory, motivation, consciousness, personality) but it's arguable whether any such list is the best way to think of the composition of a mind, and no one can say if such a list is comprehensive. Because it's arguable what is being studied, assessment methods are also controversial; does experiment X really measure motivation, or personality? Should we ask people about what they are thinking and feeling, or is that too subjective? How do we do experiments on emotions?
History is often considered not even a science because you can't actually observe the past, even in principle. Its topic is also even more provincial than biology's; it only considers humans. It is also soft; although whether or not someone was in a certain place on a certain day is pretty objective, history also concerns itself with subjective issues like which countries were most powerful, and with questions of why, such as: why did country X, which was so powerful in year Y, become less powerful afterwards?
Now we turn from investigation of truth to the creation of useful methods. We have engineering, which is the rearrangement of nature and the creation of artifacts using technology. Note that most branches of engineering are paired with a theoretical side, usually involving physics or mathematics or both, as well as an applied side; so we have things like the mathematical theory of computation and of algorithms twinned with the search for useful computing constructs, and we have the theory of linear signals and systems, control theory, and information theory twinned with their applications in electrical engineering. As in the hard sciences, the stuff being studied usually admits non-controversial definitions and measurements, and the success or failure of work is often even less controversial than in the hard sciences. However, unlike science, the topic is not just what is actually true but the much wider domain of what we could hypothetically do that would be useful.
Finally, we have the humanities. Here we are still provincially restricting our attention to humans. Here we are again like the soft sciences in that the nature of what is being studied, how to assess it, and how to assess the success or failure of work are themselves controversial. However, despite this methodological difficulty, people still work in the humanities because the topics at hand are so important to us.
There are plenty of interdisciplinary areas too. There is cognitive studies, which includes bits of psychology, neuroscience, a.i., philosophy of mind, philosophy of rationality, metalogic, linguistics, and other stuff.
There is the design of programming languages which, in a manner similar to architecture, is part mathematical theory, part philosophy, part humanities, and part engineering (and possibly part art but i don't want to make anyone angry).
The definitions given in this section (and in the rest of this book) are not very carefully thought out. They are 'naive' or 'folk' definitions, as the philosophers say, and probably often wrong. Also i don't mean to indicate that the topics i mention here are the only or the most important ones.
Some of this has already been covered in the above, in 'types of topics'.
There's philosophy. What, if anything, is or should be the discipline of philosophy has not, afaict, reached consensus. There are in fact large philosophical subareas devoted to related questions, eg epistemology.
There's the discipline of mathematics: that which can be proven. By 'proof', mathematics means (afaict) a comprehensible argument for an assertion such that any reasonable human, if they knew and understood the argument, would be absolutely and conclusively convinced of the assertion. An attempt to formalize a method of proof into symbolic form is called a 'formal symbolic logic' or just a 'logic'. There is a single family of logics, usually now just referred to as 'formal logic' or 'symbolic logic' or 'classical logic', which is by far the most used, and in which all or almost all contemporary mathematical proofs can be phrased. However it is unclear whether this family of logics is or can be formally extended to encompass all 'proofs' as defined in this paragraph, and there are some metamathematical theorems which give cause for doubt (eg Gödel's incompleteness theorem, see below). If it is not, then the questions arise (a) whether there is any single formal symbolic logic which is equivalent to 'proof' as defined in this paragraph, and (b) whether there exist 'proofs' which would not be considered conclusive and convincing to a human, but which would be to some other superior kind of rational mind (beyond just those proofs which are too long for humans to get through). Issues (a) and (b) have not, afaict, reached consensus.
There's science. Exactly what the discipline of science is remains an evolving question: see https://en.wikipedia.org/wiki/Demarcation_problem and https://en.wikipedia.org/wiki/Scientific_method . Note that historically, the word 'science' was used to mean things related to, but different from, what it is used to mean today. Eg in one paper, [3], it seems to me that when the author says 'science', they may mean "a relatively reliable, data-based, orderly method for producing and organizing knowledge".
The slight differences in meaning of synonyms in other languages confuse the issue, too (eg English 'science' vs German 'Wissenschaft'). Some potential criteria for what is 'science' include:
- some think there is no definition, eg "Feyerabend argued that science does not in fact occupy a special place in terms of either its logic or method, and no claim to special authority made by scientists can be upheld. He argued that, within the history of scientific practice, no rule or method can be found that has not been violated or circumvented at some point in order to advance scientific knowledge. " [14]
- perhaps a definition of the older sense of the word: "Aristotle described at length what was involved in having scientific knowledge of something. To be scientific, he said, one must deal with causes, one must use logical demonstration, and one must identify the universals which 'inhere' in the particulars of sense. But above all, to have science one must have apodictic certainty. It is the last feature which, for Aristotle, most clearly distinguished the scientific way of knowing." -- (Larry Laudan, Physics, Philosophy, and Psychoanalysis, "The Demise of the Demarcation Problem", quoted in [15])
(some criteria stated as negations:)
(tangential note: [23] also poses another important problem, "the relationship between science and reliable non-scientific knowledge (for instance everyday knowledge),") (many of the items in the bullet list above are quotes or paraphrases)
my take:
Note that science is not as objective as it may seem. In real data, there are often outliers that should be removed during analysis, and these removals are not always reported. In addition, there are many sorts of experimental apparatus which are 'tricky'; so there are many data-gathering sessions in which it seems clear to the experimenter that something is not working properly, and that data is thrown out and not reported; however, this decision is made in an ad-hoc and subjective manner.
Furthermore, there are a lot of negative results which are never reported (non-reporting is not only due to flawed incentives; it is difficult to know if your experimental apparatus is working properly (and if your other methodology is right) unless you get a 'strong signal' one way or the other; and many experiments can in any case only give a 'strong signal' in one direction; so if the result goes the other way, there is real reason to suspect that it may be meaningless).
Furthermore, there is a lot of bad blood between academics. It is a commonplace that someone who is your competitor is likely to do everything they can to argue against you should they end up on a committee with some power over you (eg grant review or paper review). There are tons of stories of professors abusing their position on grant review or paper review committees to hold up the other person's work while they rush their own work on the same topic to completion and get it published first. A person i met who organized an academic conference reported that as he invited each professor, very many of them replied with "I'd love to attend but only if none of the following people will be invited: a, b, c, d, ..."
" Big topics were scientific engineering programming, numerical analysis, especially numerical linear algebra, standard applied statistics -- hypothesis testing, linear statistics, analysis of variance, curve fitting, etc. -- the fast Fourier transform, digital filtering, power spectral estimation, deterministic optimal control theory, linear and non-linear programming, discrete and continuous time stochastic processes, Monte-Carlo, numerical solutions of ordinary and partial differential equations, etc.
For some of what can be done with applications, there is a long dessert buffet from the Hahn-Banach theorem and more in D. Luenberger, Optimization by Vector Space Methods. One heck of a toolkit.
...
Here's another possibility: Attack integer linear programming problems in business. Focus on the problems that can attack with min cost flows on networks with integer arc capacities. Why? Because it is just linear programming; the simplex algorithm becomes really simple in that case; in practice the simplex algorithm is fast beyond belief on problems large beyond belief; and get integer programming for no extra effort -- if start with an integer basic feasible solution, then the simplex algorithm will maintain an integer solution and, if there is an optimal solution then will get an integer optimal solution. Can use this in various resource allocation problems. For more, use it as a linear approximation in nonlinear problems and for still more enhance it with some Lagrangian relaxation.
" -- https://news.ycombinator.com/item?id=9228758
---
https://en.wikipedia.org/wiki/Portal:Contents/Outline_of_knowledge