I’m reading Sean Carroll’s new book, The Big Picture (it’s very good; I’ll provide a review when I’m done), and once again I got balled up about the difference between Einstein’s General Theory of Relativity and his Special Theory of Relativity. I can never get them straight, no matter how many times I look them up—just like I used to confuse the historical difference between Sunni and Shia Islam (I can now remember that one).
When I got confused this time, I thought, “Maybe a physicist would consider this difference something that every educated person should know.” And then I thought, “What would I want every educated person to know about my own area of study—evolution?” Well, there’s a lot I’d like people to know, like what the evidence is for evolution (that’s why I wrote WEIT), but if I had to summarize what I’d want people to know in just one paragraph, I suppose I’d say something like this. (Nit-pickers: I wrote this in a few minutes and haven’t gone over it obsessively.)
There are five parts to the “Darwinian theory of evolution”. First, evolution happens: that is, populations are genetically transformed over time. That means that the genetic constitution of a population changes from one generation to the next, not that individuals themselves change genetically. Second, that change of populations is gradual: substantial evolutionary transformation, like the evolution of bony fish into amphibians, takes thousands to millions of years. Third, evolution involves not just transformation of populations but also splitting of populations—what we call “speciation.” One lineage can divide into two or more lineages that can’t exchange genes with each other; and those new lineages can themselves split. This produced the “tree of life” that, starting with one ancestral species about 4 billion years ago, produced the millions of species living today as well as the millions that have gone extinct without issue. Fourth, if you look at the splitting process in reverse, starting with any two twigs (species) on the evolutionary tree, you can, if you go back in time, find a common ancestor of those species, just as if you take any two twigs on a tree and move down, you’ll find a common branch or node that they share. All living things are thus related, and the more recently their common ancestor lived, the more closely related they are (that’s the definition of “closely related”). Finally, the “designoid” features of organisms—the features that make them look so well adapted to their environments and lifestyles—are the product of natural selection: the combination of a random process, mutation, that generates genetic variation without regard to whether it’s “useful” or not, and a deterministic process, selection, that winnows the variation by retaining those mutations that are better able to make copies of themselves and eliminating the worse copiers.
There are other important processes of evolutionary change, like random genetic drift, but only selection can produce the design-like features that so excite our wonder. And we have strong evidence for every one of these assertions, so that the “theory” of evolution is “true” not only in the sense that it’s the best explanation we have for how life changed on Earth, but also because we have copious evidence from many areas of biology supporting all five contentions.
That’s pretty much the way Darwin laid out the theory, though of course he knew nothing about genetics or “random” mutations. And, by and large, these propositions still stand up today. There are some exceptions of course: the origins of mitochondria in cells didn’t involve just gradual change of a single species, but the integration of one species with another, somewhat blurring the “treeness” of life. We also know a lot more about the process now than we did when Darwin limned it in 1859. But the paragraph above is what I’d expect anyone who considers themselves educated to know about evolution.
Not all readers are academics, of course, but most of you have fields in which you work and, presumably, would like others to know what’s important about that field. If you feel so inclined, write a sentence or paragraph about your “area”, and what people should know about it to be considered “educated.” If there’s too much to say, just pick out one thing to highlight—perhaps a fact or misconception.
272 thoughts on “What should people know about your field to be considered “educated”?”
Heh heh. Not all “IT” knowledge is the same: I work on the Service Desk, and what I know is not transferable. So you can’t come to me with a network problem and expect me to know how to solve it. I need to go get a network person to help with that… I’ll understand the basics, but I won’t be able to fix your issue like an expert would.
I work in IT too Diana, as a BA (Business-Analyst). A typical assumption of lay-people is that everyone in IT is a techie e.g. as you say, everyone assumes we can all fix network problems (I can’t – I can only just spell network).
I used to have the same conversation over and over with my mother, trying to explain what a BA does. So, in summary:
Computer system development does not involve only coding/programming. In the good old days (before agile – look it up on Wikipedia folks, not my bag) we described the end-to-end process as a sequence of project stages: business requirements definition, design, coding, testing, training, implementation.
Each of these stages can break down further into sub-stages e.g. testing comprises unit-testing (at the program unit level), system and integration testing (end-to-end testing that it all works as designed), and UAT (user-acceptance testing, that it satisfies the business-requirements).
A BA helps clarify, document, and confirm business-requirements i.e. determine what the business needs are in business terms, focusing on the ‘what’ not the ‘how’.
Additionally, although this differs from company to company, a BA may also review designs to confirm that all business-requirements have been addressed, and also help plan and define UAT.
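To make those sub-stages concrete, here is a minimal sketch of a unit test in Python (the function, its business rule, and all names are hypothetical, purely to show what testing “at the program unit level” looks like; system/integration testing and UAT would then exercise the assembled system against the business requirements):

```python
import unittest

def net_price(gross, discount_rate):
    """Hypothetical business rule: apply a percentage discount to a gross price."""
    if not 0 <= discount_rate <= 1:
        raise ValueError("discount_rate must be between 0 and 1")
    return round(gross * (1 - discount_rate), 2)

class NetPriceUnitTest(unittest.TestCase):
    """Unit-testing: verify one program unit in isolation."""

    def test_discount_applied(self):
        self.assertEqual(net_price(100.0, 0.25), 75.0)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            net_price(100.0, 1.5)

# Run the unit tests programmatically rather than via unittest.main()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NetPriceUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The point of the sketch: each stage asks a different question, and the unit test only answers the narrowest one, “does this one unit behave as designed?”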
I have been a BSA on and off and I’m now an IT Project Manager running Agile projects (essentially a scrum master). Agile really is no different in that all those stages still happen, they are just iterative. I describe it as a layer cake – you do a bit of all stages in each sprint to give your end user a taste of the whole.
I also have been a systems analyst, process improvement specialist (using 6 sigma) and web developer. My parents have never understood what I do. They just say I work in IT.
That’s a good mix of experience you have Diana, must add to your abilities as a project manager I reckon.
Interesting you feel agile is just an iterative version of the old-school sequential ‘waterfall’ approach – I was effectively pushed out of a job a few years ago because the company adopted agile (I felt it was really just the latest silver-bullet solution to the recurring failure/expense of IT projects, flavour-of-the-month for management consultants to rip companies off). It was argued that BAs were no longer needed, because APs (Analyst Programmers) could cut out the middle-man and work directly with the business-users to prototype solutions, coding on the fly.
I think that approach has its place, horses for courses, but sometimes the development of a full integrated scope of business requirements is essential before proceeding to design, never mind actual coding.
That’s a terrible way to implement Agile. BAs always are needed, especially in bigger projects. I’m running a gigantic upgrade project right now with 3 BSAs and 2 developers.
The thing with Agile is to fail fast. You want to structure things so that you can be flexible as well and handle changing requirements and business needs. It’s not as easy as just slapping it in.
Oh I should add that requirements are often complete before going to design it is just that you are making progress by designing portions of the whole without waiting until every requirement detail is known. Often you can complete one part while still working on the requirements of another. It’s a whole process that can be adapted for the circumstance. Where it doesn’t work well is infrastructure because your requirements can’t be independent.
Also, as a project manager, I often think of my developer friends, including our own Ben Goren, who sometimes complain about us, but I see myself as an enabler. My job is to get stuff done as efficiently as possible, so primarily that means helping the team estimate realistically, keeping distractions away from the team, helping solve impediments, and shielding the team from the other noise that inevitably occurs when you work with people.
The last project I was on, we had no BAs on the project. This was for two reasons:
First, the project owner (and his number 2) were doing that role as part of being a product owner. What it meant was that he’d visit the team several times a day to talk through particular issues with cards.
Second, on previous iterations of the project that were done Waterfall style, they had really bad experiences with BAs.
On the current project I’m on, we have a BA for the upcoming work, but present work is being sorted out between myself (SM) and the Product Owner.
You’d need some sort of SME who is empowered to make decisions about what needs to be done. We’re happy for the Product Owner to fill that role when the Product Owner is capable. But we are acknowledging that the Product Owner is acting in the role of BA as part of their duties.
Agile must have evolved from Barry Boehm’s Spiral development model from the 1980s – which is basically iteration over the basic phases of software development. I was in the business long ago and we incorporated some of his ideas ad hoc.
I think parents not understanding their offspring’s profession may be the norm.
My colleagues here at work who develop the code that runs our devices use a management system based on Sprints (is this the essence of “agile”? — I don’t know the correct words).
Here, it is applied to code functions: In the Sprint we will complete functions X, Y, and Z.
This seems to be very effective for them and helps prevent scope creep.
Think of Agile Development as a philosophy which has multiple implementations. Scrum is one of those implementations which uses the term Sprint to denote a time-boxed work iteration – usually 2 to 4 weeks long.
Software developers usually take well to Agile because they hate waiting forever until requirements are complete.
Last job before my current one, I ended up sitting around for months for requirements to be complete, then sitting around for months while the system was being tested. Then sitting around for months after the release while they argued over fine details of specifications before development could even begin.
A sprint is a period of time in which you theoretically could deliver working software (if you’re developing software) when you are working in a flavour of Agile called Scrum (and maybe other flavours). The team agrees on the length of the sprint during a sprint planning session. You don’t want it too long because if you make a mistake you want to catch it early. You don’t want it too short because then you have more planning to do so more admin work. I find 2 weeks has worked well but I’ve sometimes extended the sprint out a week for various reasons.
The sprint planning involves understanding the sprint goals, then breaking down epics (high level requirements) and user stories (more detailed requirements) into tasks. Epics span sprints; user stories can be completed in a sprint. Tasks can typically be done in a day. Completing a user story should deliver value to your customer. The time it takes for a customer to receive value is lead time (this is more a Lean concept). You also estimate your user stories and identify resource allocation at each sprint. Your product owner and team are all present in these meetings. It’s highly collaborative.
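A minimal sketch of that epic → user story → task breakdown (all names, point values, and classes here are made up; this just illustrates the hierarchy, not any particular tool):

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """Smallest unit of work; typically completable in a day."""
    name: str
    done: bool = False

@dataclass
class UserStory:
    """A detailed requirement, estimated in points; fits inside one sprint."""
    name: str
    points: int
    tasks: list = field(default_factory=list)

    def is_done(self) -> bool:
        # a story is done only when every one of its tasks is done
        return bool(self.tasks) and all(t.done for t in self.tasks)

@dataclass
class Epic:
    """A high-level requirement; may span several sprints."""
    name: str
    stories: list = field(default_factory=list)

# Sprint planning: break the epic into stories and tasks, then estimate.
epic = Epic("Customer self-service portal")
story = UserStory("Customer can reset their password", points=5)
story.tasks = [Task("Design reset email"), Task("Build reset endpoint")]
epic.stories.append(story)

# During the sprint, tasks get completed one by one.
for task in story.tasks:
    task.done = True
```

Completing the story (all tasks done) is what delivers value to the customer, which is why estimation and breakdown happen at the story level rather than the task level.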
It’s a lot to discuss with Agile but most companies use it now because they need to be nimble in releasing product. Even more conservative places like banks and insurance companies are using Agile.
“Sprint” “Agile” “Scrum” – do you have an Olympics team?
I think all the terms come from rugby or some other sport ball game.
Thanks very much for this detailed reply Diana. This makes great sense and fits well with how I see my SW colleagues working.
Our sprints here typically (from the scheduling I’ve seen) last about 4-6 weeks.
This is typically feature development for medical devices and I think they do a lot of internal testing before turning the SW image over to others for system level testing. This may be the reason for the length of the sprints. Then again, we might just be slow! :0
I just started as a scrum master a month ago, so I’m trying to learn on the job. Never thought I’d be in a role where my job is being a people person – was happy just to write software.
LOL, 30+ years as a systems admin VM/XA, MVS, Z/OS, AIX, HPUX, Linux…
People don’t understand when I tell them I can’t fix their PC.
I don’t do Windows. 🙂
I’m very lucky. My wife does my windows for me. She happens to be a material scientist who can’t program, but she’s great with a circuit board. 😎
I keep telling my wife “I don’t do Windows”; she keeps sending me out there with a squeegee anyway.
Please listen to Joe Brown’s recording of “When I’m Cleaning Windows”, I think you’ll love it. (That whole record, “The Ukulele Album” is excellent.)
You’re probably thinking of George Formby.
Joe Brown does a cover of that song. (Obviously a natural for a ukulele album.)
Formby’s version is excellent too! I see from the Wikiness that Formby wrote it and he had the distinction to have it banned from the BBC!
Those are both great!
I didn’t know Formby wrote it or managed to have it banned. Interesting.
Van Morrison also did a great tune titled “Cleaning Windows.”
Sounds familiar. I was NOS/BE, MVS, VM and later AIX, HP-UX and I forget what other flavors. Retired now, I’m Linux and my wife — most unfortunately — is Windows. Oh, well…
Perhaps she’ll switch rather than use Windows 10.
She is already using the eternally-updating W10. And we have a slow (2Mbps) network connection. Sigh.
I installed Never10 and it prevents all the Win 10 shenanigans. I’m using Win 7.
My computer is Win7; I love the system. My computer is a work station, not an entertainment station, so Win 8, Win 10, Vista, et al. were right out.
I’m sure Linux users feel similarly.
I’m quite happy in Win 7 (or XP). I used to use UNIX at work (along with lots of various forgotten SW). I’m reasonably comfortable in Mac OS. I was mac-only until Win 95. After that, the Win system was fine and for compatibility reasons (friends, work) I went over to the dark side.
I run Linux at home (and, infrequently, WinXP), I used to have to run Windows at work (till I retired). I quite agree WinXP was the best version. When I got upgraded to Win8 (I think it was) the first thing I found out how to do was switch to the ‘classic’ desktop.
Win8 looked like it was designed for a smartphone.
(Similarly in Linux, the latest Gnome desktop seems to have followed suit, I switch to ‘Gnome Classic’ or one of the several more traditional-looking desktops available).
“I don’t do Windows.”
Me neither. Which is to say, my home setup is exclusively various flavours of Linux.
I had to use Windoze at work, but since IT would never let us users fiddle with the works, I know nothing useful of Windows after Win98.
Then my father, who lives the other side of town, would phone me up any time he broke something on his Windows setup and expect me to tell him what to do. Over the phone. When I didn’t have a working Windows to crib off…
“business requirements definition”
This is what all my IT (coding) friends complain never gets done well or in a timely manner.
Developers build interesting code that nobody knows how to use, BAs want interesting services that nobody can code.
Good one! 🙂
That’s why working in an Agile team works. When I taught Agile to BAs I told them that in Agile you are no longer alone. You are no longer solely responsible for requirements. It is the job of the whole team to understand them, though a BA may lead initiatives to refine those requirements with the team and product owner (the business).
I try to tell people that I’m the IT equivalent of a General Practitioner, and while I have some specialist knowledge, for the most part I am not (the equivalent of) a combined transplant surgeon, pathologist, neurologist, gynaecologist and a zillion other specialities.
I tell my colleagues that I can write a sensible referral to any of these people, but if you want me to perform their specializations, you are going to need to give me time and a whole lot more money.
And even though most of my colleagues are specialists in their own disciplines, they still find it unacceptable that an IT person doesn’t know everything about that very wide field.
“Special relativity” is the special case where accelerations are zero, and thus relative velocities are constant.
“General relativity” is the more general case where the acceleration can be anything.
I do not think this is entirely accurate. SR includes accelerating frames by using the so-called clock postulate. GR postulates that an accelerating frame is the same as a gravitational frame. I have always liked John Wheeler’s two line description of GR:
‘Matter tells space how to curve.
Space tells matter how to move.’
Although Wheeler should have said space-time rather than space.
One very late, very drunken night out in a bar, a friend introduced me to another friend of his by saying, “This guy’s a crazy genius. Ask him anything about anything.” I said, “Explain the difference between general relativity and special relativity in ten words or less.” I don’t remember his exact answer, but he passed the test with flying colors. I really wish I could remember what he said. It was perfect.
Gee, I wish you could.
Did you get his number? 😎
As PuffHo would say, “Drunken savant has genius response to question about the theory of relativity.”
Bam! Hahahahahaaha! Good one!
Practicing in PuffHo style has humorous payoff 🙂
When I read the statement
“Special relativity” is the special case where accelerations are zero, and thus relative velocities are constant.
it seems to me that it means that Special Relativity cannot deal with problems involving accelerated bodies. I think that is what every layman will understand, and it is a pity, because it is not true.
A lot of problems involving accelerated objects are treated in the framework of Special Relativity.
I think the sense in which your statements are true can only be grasped if the statements come with a large side of explanation.
Special relativity does deal with acceleratED bodies (ones moving with a constant relative velocity), but not with acceleratING bodies (where the relative velocity is changing).
No. In SR you deal with an acceleratING object using a momentarily comoving reference frame. It is explained in any relativity textbook. See here.
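That point can be checked numerically (a sketch, assuming SI units and a constant proper acceleration of about 1 g; the specific numbers are illustrative). In flat spacetime, SR handles an acceleratING object by integrating the proper time dτ = √(1 − v²/c²) dt along its worldline; for constant proper acceleration a, the result matches the closed form τ = (c/a)·asinh(aT/c), with no general relativity required:

```python
import math

c = 299_792_458.0   # speed of light, m/s
a = 9.8             # constant proper acceleration, m/s^2 (about 1 g)
T = 3.156e7         # coordinate time of flight: about one year, in seconds

def v(t):
    # coordinate velocity of a uniformly accelerated object in SR
    # (stays below c for all t, unlike the Newtonian v = a*t)
    return a * t / math.sqrt(1.0 + (a * t / c) ** 2)

# numerically integrate dtau = sqrt(1 - v^2/c^2) dt (midpoint rule)
n = 200_000
dt = T / n
tau_numeric = sum(
    math.sqrt(1.0 - (v((i + 0.5) * dt) / c) ** 2) * dt for i in range(n)
)

# closed-form SR result for the same accelerating worldline
tau_exact = (c / a) * math.asinh(a * T / c)

print(tau_numeric, tau_exact)  # the two agree, and tau < T (time dilation)
```

The accelerating object’s clock runs slow relative to coordinate time, and the whole calculation lives entirely inside special relativity.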
You are both right, I think.
Special relativity is traditionally based on non-accelerating (inertial) frames to assert the same (invariant) laws. Physically it asserts its basis is exclusively non-accelerating frames by using Lorentz transformations.
But of course as physics it can be used for sundry systems if one is careful.
On the other hand I would say that general relativity is based on introducing spacetime curvature. “Special” relativity applies for negligible curvature (as noted in other comments).
That GR then encompasses accelerating frames is natural, but not explicitly its basis. (But I haven’t studied GR, so what do I know?)
The generalisation to accelerating frames was certainly part of Einstein’s motivations. This is Einstein from a Nobel Prize lecture in 1921 (link):
“In common with classical mechanics the special relativity theory favours certain states of motion – namely those of the inertial frames [“inertial frame” = not accelerating] – to all other states of motion. This was actually more difficult to tolerate than the preference for a single state of motion as in the case of the theory of light with a stationary ether, for this imagined a real reason for the preference, i.e. the light ether. A theory which from the outset prefers no state of motion should appear more satisfactory […]
“The conclusion is obvious that any arbitrarily moved frame of reference [= allowing accelerations] is equivalent to any other for the formulation of the laws of Nature, that there are thus no physically preferred states of motion at all in respect of regions of finite extension (general relativity principle).”
Of course Einstein then argued that acceleration is equivalent to gravity (equivalence principle), and thus owing to the generalisation his theory turned into a theory of gravity.
Thus the statement that SR restricts to zero acceleration whereas GR generalises to include acceleration is equivalent to saying that SR is effectively a zero-gravity treatment whereas GR includes gravity.
Well, you lost me.
To take an example: there is a relativistic treatment of cyclotron radiation, is there not (due to Schwinger, if I am not mistaken)?
In that problem, there are objects with changing relative velocities, are there not?
That does not require General Relativity Theory, does it ?
Am I completely deluded on these matters ?
The whole formalism of Special Relativity starts by assuming inertial frames (a frame in which an accelerometer would read zero).
General Relativity, in contrast, is all about accelerating frames.
Now, there are instances, as you point to, where accelerations are sufficiently small or orthogonal to the main motion that one can usefully use a SR formalism to analyse them.
But I’m still sticking to my basic point that SR is all about how things appear in inertial frames and GR about how things appear in non-inertial frames.
Thank you for that answer.
So, if I understand well, a relativistic treatment of cyclotron radiation is possible within the framework of Special Relativity only because the acceleration is normal to the motion ?
I think of it simplistically: special relativity involves light, general relativity involves gravity. (500,000 foot view)
I tend to remember that “G”eneral Relativity starts with a “G”, just like “G”ravity, and go from there. Soon, I recall that general relativity explains gravity as a result of the curvature of space/time.
There once was a lady named Bright
whose speed was much faster than light
She set off one day
in a relative way
and returned home the previous night.
Reminds me that Isaac Asimov wrote (and collected) limericks both scientific and bawdy.
Special relativity is about flat mostly empty space. General relativity is the more general case where space is distorted by gravity.
General relativity is special relativity modified to include gravity.
This statement is pretty much equivalent to my statement since, by the Equivalence Principle, gravity is equivalent to acceleration. Thus “zero gravity” and “zero acceleration” amount to the same thing.
I think it is more pertinent simply to know that special relativity is about the speed of light (although that’s almost an accident; it’s really about the ultimate speed limit, and light just happens to have that speed… we think), whereas general relativity is about gravity. So: S for speed and G for gravity. Why they are called special and general is something I have never understood.
Sorry, but “special relativity only applies to zero acceleration” is completely false, and a pernicious misconception that deserves to be stamped out.
In SR, non-accelerating (inertial) motion is “preferred” in the technical sense that everyone agrees on whether motion is accelerated or non-accelerated. But that doesn’t mean acceleration doesn’t exist, or isn’t described by the theory. There is no problem whatsoever with dealing with accelerated motion in SR.
The single difference between special and general relativity is that GR describes curved spacetime (and therefore gravity) and SR describes flat spacetime (and therefore no gravity).
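In metric language, which is the standard way to state that distinction: SR fixes the spacetime metric to the flat Minkowski form, while GR promotes the metric to a dynamical field \(g_{\mu\nu}\) sourced by matter through the Einstein field equations:

```latex
% Special relativity: fixed, flat Minkowski metric
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2

% General relativity: dynamical metric g_{\mu\nu}, curvature sourced by matter
ds^2 = g_{\mu\nu}\,dx^\mu\,dx^\nu , \qquad
G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```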
Thanks! Straight from the horse’s mouth.
Annie Hall moment.
… and you are?
Thank you. That should settle the matter.
Since an expert has piped up, I have a very lay- layperson’s question I’ve been greatly wanting to ask. My understanding is that the notion of spacetime didn’t originate with Einstein, but with the mathematician Minkowski, who provided it as a semantic interpretation for aspects of SR, and that Einstein agreed with this interpretation. Is this so? And if so, how might Einstein have viewed all this before the label “spacetime” got coined? Was it all just equations waiting for a label to bring them to life?
Einstein eventually accepted Minkowski’s four-dimensional vector-space formulation of SR, but at first he was rather dismissive. Perhaps it was because Minkowski was Einstein’s math professor and once called him a lazy dog. It is a good job Einstein changed his mind, because Minkowski’s spacetime was an important step toward GR.
This is all above my head, as it were . . .
Is anything not in motion?
Is the “bending” of space-time an analogy?
Since there is no up or down in the universe how is space-time “bent” in the absence of a mass of matter?
What’s the matter?
Does it matter?
The mnemonic I use to keep them straight is:
SR is E = mc^2
GR is space-time curvature
Clever code is bad code.
But clear clever code is genius.
But remember, once the genius leaves the project some novice will have to maintain the damned thing. 8-(
Spaghetti code is bad code.
Buggy code is bad code.
From that we learn that having bugs in the spaghetti is a disaster.
Code that is hard coded and customized is also bad code. The code may be nicely written but impossible to maintain.
I’d have to disagree with you there. I wrote / maintain a stock management / ordering / invoicing program for a friend who imports small-volume car parts (it started as a hack so he could read .DBF format parts catalogs). It is therefore 100% customised (though obviously some routines are re-used where possible). I’ve thought of generalising some of it but that usually adds to the size and complexity, rather than reducing it.
Admittedly this is a special circumstance in that the program is never going to be used by anyone else. But my point is, the customisation as such does *not* make it any harder to maintain or modify.
I’m thinking more along the lines of systems that have perfectly acceptable out-of-the-box code that can be configured, but instead people customize it because a business user wants something to do something differently. Often this “something” will be addressed in later releases anyway. Examples include SAP (you can really make a mess of it) and various Service Desk applications that can be stretched to do things they were never intended to do.
Yes I appreciate that you were talking about ready-made programs.
If the customer really wants something different, though, that suggests they maybe should have bought a different application in the first place.
I have suffered from mis-applied marginally suitable applications that were inflicted on us users because some other part of the organisation was using something apparently (but not really) similar already and IT didn’t want the bother of evaluating and installing a different application.
I also noted with gloomy cynical satisfaction that most of the much-hyped new computer-based systems they introduced were fiascos that were quietly forgotten about after a couple of years… Not, I think, the fault of the developers so much as management thinking they could solve everything by introducing yet another system…
Introducing yet another system is really just another version of GIGO. Often people don’t know how to fix their malfunctioning processes so they simply add to the mess.
I most thoroughly agree. Rather than fixing what wasn’t working, they just added another layer of complication to the mess.
I recall the author of ‘Up the Organisation’ saying there should be a Vice-President in Charge of Killing Things whose job was to get rid of all the marginally useful but, en masse, stifling procedures that just naturally grow in any big organisation. He was right.
Places that embrace continuous improvement often have an area dealing with fixing such things, but sadly, most don’t want to bother. My real area is improvement, as I love to solve problems, and I hope to return to that line of work one day.
“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?” — *Elements of Programming Style*, Kernighan & Plauger
I can most fervently echo that!
I think it’s crucial to understand the existential risks unique to the 21st century. Just 71 years ago, there was a small handful of improbable risks that haunted our species — supervolcanoes, asteroid/comet impacts, global pandemics, and so on. This century, though, there are far more than this. Climate change, biodiversity loss (the sixth mass extinction), nuclear weapons, biotechnology, synthetic biology, nanotechnology, and even artificial superintelligence. While advanced technologies could enable us to neutralize some of the risks from nature, they’re simultaneously introducing more risk scenarios than our species has ever before encountered in its 200,000-year existence. Furthermore, not only is technology becoming exponentially more powerful, but fields like biotech, synthetic biology, and nanotechnology are becoming increasingly accessible as well. This is incredibly worrisome because the squishy computers behind our eyes haven’t changed much since we emerged from the Paleolithic.

By 2050, there will be some 93 million psychopaths in the world and nearly 300 million sociopaths. And if Pew is correct and there are 2.76 billion Muslims by the middle of this century, there will be about 93 million jihadists roaming the planet. If a doomsday button were to become widely available — or even more available than it currently is — we could be headed for a catastrophe of genuinely existential proportions. There is, indeed, a reason why experts like Sir Martin Rees give civilization a fifty-fifty chance of making it to the 22nd century. (Others who’ve given similar estimates are Nick Bostrom, John Leslie, Richard Posner, myself, and arguably Stephen Hawking, Elon Musk, etc.)
See http://www.xrisksinstitute.com for more.
(My apologies for typos.)
While I did not know the details, some of the risks you mention have been floating in my mind, and I suspect in the minds of many. I would have to guess it’s having an impact in producing a sense of general anxiety and dread in some.
Probably, risk reached a peak during my adolescence, during the cold war threat of nuclear confrontation — specifically the Cuban missile crisis. Then risk would have declined until some of the newer threats you point to here became more prominent. I suspect risk will develop, not as a gradual ramping up of bad possibilities, but as a series of peaks and valleys.
In any event, if some disaster happens, I want it to be after my time.
Thanks very much for the work you do. It might be the key to our survival as a species.
My impression is that the risk of nuclear war didn’t fall monotonically, if at all, between the missile crisis and 1990. It may have increased due to increasing reliance on multi-warhead missiles and less on manned bombers, leading to sophisticated counter-force strategies and hair-trigger launch-on-warning scenarios.
Compared with the cold war decades, we are certainly in a valley at the moment, but for a number of reasons, I fear the risk will rise in future — here I will resist citing details of current leaders and would-be leaders’ thought!
I suppose that there are thoughtful scenarios getting out of the nuclear trap, but I’ve not come across one… perhaps just about all technological societies really do self destruct.
Of course changes introduce new risk, but they also mitigate old risk. (Global starvation, poverty and wars soon eradicated, et cetera.)
Our brain-body system has evolved immensely since “modern humans” split from and later merged back with larger-brained humans like Neanderthals. Our brain has shrunk, our intelligence is shooting up, some can tolerate lactose, et cetera.
Risk can be handled well if it is understood. Do these people understand society and its development, or are they just the same irrelevant futurists that we have seen every generation for, oh, maybe at least 200 kyrs?
It’s true that technology can lead to the reduction of certain risks. But virtually every scholar in the field believes that, on balance, the overall risk today is greater than it’s ever been before (by a long shot). And yes, risks can be handled if understood. The problem is that (a) there are very few scholars actually studying existential risks (more papers published on dung beetles than x-risks), and (b) many of the greatest risks to our species’ prosperity and survival are new. Never before has humanity encountered anthropogenic climate change, nuclear winters, engineered pandemics, self-replicating nanobots, and superintelligent machines. And no, the relevant scholars aren’t “irrelevant futurists.”
Don’t forget that talking more about climate change, over-population or biotechnology — to name only a few subjects — does not change the fact that the nuclear problem is not only still there, it’s got worse.
I was in IT for 40 years (degree in Math&CompSci). If there’s one thing people really need to know about computing anything it is:
garbage in = garbage out
especially in the context of computer models.
I think most people are familiar with the GIGO principle. I’d suggest the problem lies in objectively identifying what constitutes “garbage”.
With regard to my summary below of the life of a BA (Business Analyst), I’d define any system that does not satisfy business requirements as garbage.
All my friends and colleagues in IT (at least those designing code) always have one cry: “Give me good requirements!”
It seems that almost all customers have a very hard time defining what is required of the SW.
I want my developers involved in the requirements. I don’t like being the middleman. I can help facilitate, and I have a good enough understanding of the system to do so, but I never want to just hand over requirements. I want the developer input and I want the business user to hear it.
I used to say that our customers never knew what they wanted in a system until they didn’t see it. Good requirements depend on the system user having thought through their needs with some clarity. This doesn’t happen until the user has spent a good deal of time with the developer defining requirements in an iterative process. In truth, the requirements will probably never be completely defined, leading to a constant need for maintenance and changes throughout the system’s life after implementation as new requirements are introduced and old ones abandoned. Only when you show the users a prototype, however thin, do they really start to think about ‘the system’.
This situation will probably endure until hardware technology is able to implement the Maybe-gate and software the “Perhaps-statement”.
As for GIGO: I’ve worked with many people, including scientists, who didn’t get that their models were algorithmically incomplete (missing complex processes entirely, with flawed implementations of their best, wrong, guesses at parameter values) and who insisted that the resulting model predictions were totally reliable. There are fields where this is an ongoing major problem; not everybody gets GIGO.
That’s a very good point. Ask people to define their requirements in the abstract and they won’t have a clue what they need and the ‘nice-to-have’s will bury the essentials.
Let them use an actual thing for a while and they’ll discover all sorts of necessary features that were never mentioned in the original specifications.
This is, after all, why manufacturers have to exhaustively road-test new car models.
That’s what I like about Agile. It’s part of the Agile Manifesto: “we value customer collaboration over contract negotiation.” Someone said, and I quote it often, “A user story is a placeholder for a conversation.” You need to be talking about your requirements. Even during estimation, you’re fleshing out more details, and bringing the end user into the process allows them to appreciate what it takes to bring their requests to fruition. It brings a lot of trust into the relationship because they become part of the process (this can be a tricky cultural change in some places, where the norm is to throw a request over the fence and go on with your own work until it’s thrown back to you).
Yes, that is like a design-build team or integrated product team for hardware, which works well. Everyone is at the table (including manufacturing and post-market support) and has both privileges (input) and responsibilities (they have to own their outputs).
I’ve been involved with these for many years and they work well (overall project timeline, rework, customer satisfaction, etc.)
Here’s an analogy that might help.
The Special Theory is like a map. The General Theory is like a globe.
This is actually a very precise analogy. The Special Theory is a “flat”, linear approximation of the General Theory, just as a map is a flat, linear approximation of a globe.
Jerry makes a statement here (that I recall reading in previous posts too) which confuses me: “the genetic constitution of a population changes from one generation to the next, not that individuals themselves change genetically.”
But mutations do happen at the individual level – isn’t this ‘genetic change’? I understand it’s the frequency of the different mutations (alleles?) across the isolated population that ultimately drives evolution and upon which natural-selection acts, but doesn’t evolution start with change at the individual level?
You don’t need any individual changes at all for a shift in the relative frequencies that constitute evolution. If you have 90% A and 10% B at time 1, and 80/20 at time 2, evolution has taken place.
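To make that concrete, here’s a minimal sketch in Python (with made-up numbers, not real genetics) of a population in which allele B is a slightly better copier than allele A. The frequencies shift over generations even though no individual’s genome ever changes and no new mutations arise:

```python
import random

random.seed(1)

def next_generation(pop, fitness):
    # Sample parents in proportion to fitness; no new mutations arise.
    weights = [fitness[allele] for allele in pop]
    return random.choices(pop, weights=weights, k=len(pop))

# Start at 90% allele A, 10% allele B; B makes copies slightly better.
pop = ["A"] * 900 + ["B"] * 100
fitness = {"A": 1.0, "B": 1.1}

for _ in range(50):
    pop = next_generation(pop, fitness)

freq_B = pop.count("B") / len(pop)
print(f"frequency of B after 50 generations: {freq_B:.2f}")
```

A 10% copying advantage, compounded for 50 generations, takes B from a 10% minority to a large majority — evolution by shifting frequencies alone.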
It’s a good point you make, how frequency across a varied population can change without any further mutations at the individual level.
But the variation of A and B must have occurred at the individual level in the past, no?
Mutations occur when an individual is creating a NEW individual. It is inherent in the reproductive process – the transitional exchange BETWEEN a fixed individual and its fixed individual offspring.
Yes, but a new mutation in an individual is just a new mutation, it is not evolution. For evolution to occur, the mutation must spread throughout the population.
I was about to comment that mutations do occur between generations; across our genomes we typically have around 150 mutations that are not present in the somatic DNA of our parents – something I’d have thought about adding to Jerry’s list – thus there is a ready driver.
Somatic mutations in individuals (changes occurring in cells of the body other than the reproductive cells – egg and sperm) are not passed on to offspring (at least in animals). As such they are not really playing a role in evolution, but they are important in diseases such as cancer.
I’m not even close to a biologist, but I think what Jerry is saying is that individuals don’t evolve. Yes, a mutation, that is, a departure from the genetic material of one’s progenitors, occurs in an individual. But that is not evolution by itself. That is simply making a flawed copy. Over generations, natural selection acting on those mutations can result in evolution.
Yes, and I suppose I could have written it more clearly. Incorrect “transformational” theories of evolution posit that every individual in a population changes over time, while the correct “variational” theory says that individuals don’t themselves “evolve”; rather, the population evolves via changes in the frequencies of its constituent genes. Belief in “transformational” evolution is one of the biggest public misconceptions about evolution.
Thanks Jerry. You’ve clarified that individuals don’t evolve, only populations do. But isn’t it the case that variations have to start with changes in individuals? And therefore the evolution of populations ultimately depends on individual change, current or in the past — otherwise how can variation start?
Yes, you’re right, but the change in any one individual is a single DNA change. The changes in populations over time can be HUGE.
It seems the ‘over time’ bit is a major public misconception too, an inability to grasp the span of geological-time.
Which seems to be at the heart of the ‘I accept micro-evolution, but not macro – kinds can’t change into other kinds’.
If you accept micro-changes over geological time, how come it isn’t obvious that this will, as you say, result in HUGE changes — i.e., given long enough, new species?
I have another question I’d appreciate your clarification on: I’ve read in a number of places the idea that evolution can only work on, can only modify, what’s already there, e.g. fish fins evolving into limbs. But given that the evolution of life ultimately builds from a simple self-replicating prokaryotic cell 4 billion years ago, right through to the ‘creation’ of Homo sapiens 200,000 years ago, doesn’t this inevitably involve steps of extreme mutations that we can view as ‘leaps’ to new features of body-design?
Hmmm… my take on answering this question is this:
small changes (mutations) at the level of the genotype can produce a wide variety of effects at the phenotype level (a large change, or perhaps no change at all). It is wrong to infer that a large change in phenotype necessarily required a large change in genotype. Then it comes down to probabilities – a sudden large change in phenotype is almost always deleterious. A smaller change of phenotype that has a small adaptive value, but no deleterious aspect, can be “built upon” in further generations by natural selection. So my feeling is that “leaps” are almost impossible (the “almost” covering a very, very remote possibility)
I leave it to Jerry to say if what I think is actually correct
Your distinction between genotype and phenotype is really helpful. So my phrase ‘extreme mutations’ should probably be ‘extreme phenotypic change’ which may come about by a ‘simple’ genetic mutation.
But my core question still holds: can we consider the phenotypic results of some mutations as ‘leaps’ in evolution? Are these qualitative changes (e.g. the first light sensitive cell as the foundational stage of ‘sight’) rather than quantitative (e.g. stronger muscles that allow an animal to run faster)?
However, maybe some genetic mutations are themselves ‘extreme’ – for example, the fact that different species have different numbers of chromosomes must mean duplication (or reduction) occurred at key stages in the tree of life?
Well, in my view there are no such things as “leaps” – only different mutation-induced changes, some very noticeable and some not. Think of the situation mathematically – a small change “in the right direction” is certainly better than a wild lunge “anywhere”. Then think of functions that demand multiple combined gene changes – just how improbable must that be? Yet a small change in the right direction, and a direction that proves useful in some way, can be built upon. It can even be built upon for a different function than the one that was useful in the “first step change”. It’s all driven by probabilities. In math it’s a form of “hill climbing”, and you’re highly unlikely to get up a hill by making wild “leaps”
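For what it’s worth, the hill-climbing picture can be sketched in a few lines of Python. This is a toy one-dimensional “fitness” peak, not a model of any real genome: proposals are only kept when they improve fitness, and small proposed steps climb far more effectively than wild lunges:

```python
import random

random.seed(0)

def climb(step_size, steps=2000):
    # Hill climbing on a single "fitness" peak at x = 0: propose a random
    # move of at most step_size, keep it only if fitness improves.
    fitness = lambda x: -x * x
    x = 10.0
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(x):
            x = candidate
    return fitness(x)

# Average over several runs: many small steps versus wild "leaps".
small_steps = sum(climb(0.5) for _ in range(20)) / 20
wild_leaps = sum(climb(50.0) for _ in range(20)) / 20
print(small_steps, wild_leaps)
```

The small-step climber ends up essentially on top of the peak; the wild-leap climber almost never lands an improvement once it is anywhere near the summit, so it stalls well short of it.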
I think the word “evolution” implies an effect over time. The “change” in an individual is not something that changes over time. The individual is born with a mutation, relative to its parents. I wouldn’t call simply being born with a mutation “evolution”. Over time (generations) these mutations can affect what the population looks like. That’s evolution.
As a non-expert I’ll hazard an answer (and would appreciate corrections). It’s variation that starts at the individual level through “random” mutation, adding to the variability of the population.
Speciation can happen through genetic isolation of a population, but any population can be expected to go through evolution over time.
The context is the Creationist idea of evolution; Creationists think they are being clever when they insist that nobody has observed how, say, a crocodile transforms into a duck.
I hesitate to speak for my tribe (engineers) but I’ll say what I usually say when someone asks “what do you do?” beyond merely accepting “engineer” or “mechanical engineer” or a combination of “mechanical engineer” and “I work for [or on] X.”
Engineering uses scientific principles to design useful things that people will want to pay for, that are safe, reliable, effective, and cost-efficient. (If any of those characteristics are missing, you’ve done a bad job.)
As I like to summarize: An engineer can do well with one dollar what any fool can do poorly for 2 (or more) dollars.
As a mechanical engineer, the essence of what I do is: I use analysis and testing (there’s a lot packed into those two words) to ensure that the product I’m helping to design doesn’t break. And that its shape, size, materials, weight, form factor, etc. are appropriate for its intended use (there’s a lot packed into that phrase as well).
I currently work on implantable electronic medical devices (in the past I’ve worked on everything from roads to high-tension powerlines to commercial airliners). I often tease the electrical engineers with: It’s an electrical engineering issue until it breaks, then it’s always a mechanical problem!
And another bit of humor: A friend was in a mechanical engineering final exam at university when another student asked the professor, “Can we assume zero friction in the bearings?” To which the professor replied, “No. All the frictionless bearings are in the physics department, down the hallway!”
Do you have a favorite “mathematician, physicist and engineer walk into a bar” joke?
I should, but I don’t! I have a heck of a time remembering jokes!
A mathematician, a physicist and an engineer were asked to find the volume of a red rubber ball.
The mathematician derived the relevant equations and evaluated a triple integral.
The physicist immersed it in water and measured the volume displaced.
The engineer looked it up in his red-rubber-ball table.
That is an excellent one!
I lost a job because “everything’s on the internet” and “anyone can do her job”
…. I disagree with this, of course.
For my own field:
Cancer is not a single disease
Even tumors that look similar at a microscopic level do not all behave the same.
If you read about a “miracle cure” on the internet and it’s not available in a respectable academic institution but requires spending a large sum in a third world country, you are being duped.
My field too.
I would add two things.
1) The processes which influence how cancers arise, grow and spread in a person involve many of the same processes seen in the evolution of species: mutation, which occurs at random (with respect to utility), followed by selection. Random drift can play a role too.
The selection is for survival of the abnormal cells in the midst of a vigilant immune system. The mutations that drive tumorigenesis are primarily environmentally derived, but they can also occur through random replicative processes. They confer on normal cells some change in function – this can be metabolic (the cells become phenotypically different from normal cell types), or it can affect survival (the cells become immortal – mutations make it so they do not die when “told” to), or metastasis (the cells obtain the ability to move), and other tumor functions.
Selection occurs such that the cells that survive either hide from the immune system (mutations cause them to be “invisible” to it) or shut the immune response down. This iterates over time until the cancer cell is free to grow and the person notices a suspicious lump.
2) Sadly most of the time, by the time any patient goes to her doctor because she thinks something is wrong, those processes are far along and that often makes treating the cancer more complicated.
The consciousness-raising thing I say to people about cancer is this: It starts from a single cell with a mutation (or series of mutations) and has its own family tree of cells descended from that one cell.
And therefore every one is unique (though some types tend to behave in typical ways.)
I got in a big fight on FB when a friend posted that cancer was just about making money so that’s why she doesn’t give to that cause. Her “evidence” was that it hadn’t been cured yet and in her experience the treatments were the same 20 years ago.
I ripped a strip off her citing examples in treatment from my own cancer (breast cancer) — she foolishly used my cancer as an example– and then I mentioned about how cancers are not one disease and that mainly the thing they have in common is out of control cell growth. I finished by asking her if she thought all researchers were evilly watching their friends and families die as they secretly got rich.
I think the public understanding what cancer actually is would be a huge step forward in medical/pathology literacy.
And in all that long description of evolution, not one mention of its “moral implications!”
I’m not saying one can (or can’t) derive some sort of moral theory by starting from the fact that we evolved as a group-dwelling species. But many popular criticisms of evolution involve complaints about what it’s supposed to be telling us to do, how it’s going to affect our behavior — we’re animals, behave like it! If someone thinks this goal is really at the heart of the theory of evolution, then it ought to be a bit confusing to them when experts present lists of “what you ought to know” without including anything about that.
As for what I think educated people ought to know, I’ll just focus on the very general rule of thumb to “Avoid Stupid.”
How many of the people who say that accepting evolution encourages us to behave like animals have any idea how animals actually behave?
And how many people who say that accepting evolution encourages us to act like animals also fall all over their pets for being more loyal, more loving, more innocent, and just plain better than any human?
Red in tooth and claw, is what they mean, I guess.
Me, I’d rather do like the bonobos do.
My field: retired. Enjoy life, help others and kvell the grandchildren.
“Kill the grandchildren???”
Oh, wait. Never mind.
• There *are* rules
• There is maximum creativity
• Know who you are designing for (yourself or a specific audience) and adjust accordingly.
(P.S. I like the design of your comment: nice layout and use of caps, bullets, ‘*’. I wonder if there’s a name for textual emphasizing markers like you’ve used.)
Why… thank you! It probably does have a name, but I just think of it as ‘generally accepted practices’ for readability.
As a doctor who mostly treats back pain, I’ll say this:
Back pain is not a diagnosis. Translating the phrase “back pain” into Greek or Latin does not improve this situation.
You need to determine what is causing the pain (bone, disc, nerve, joint, muscle, etc.) in order to treat it properly (if it needs treatment).
Most back pain will get better on its own, in spite of, not because of, your home remedies and advice from the Holistic Healer of your choice.
Chiropractors are not doctors.
If it goes down your leg you can call it “sciatica;” if it doesn’t, you shouldn’t.
Damned L4-L5 joint!
Things to know about information theory and “soft” computation:
1. “Information” can be thought of as a measure of “surprise” (i.e. stuff you couldn’t deduce without checking).
2. “Complexity” is a measure of how hard it is to reproduce something; it’s about the recipe, not the appearance.
3. When something relies on a bit of random luck, that doesn’t mean it’s completely out of your control. There’s a lot we can do to make ourselves luckier, and it’s often much better to be lucky than to be certain, because certainty usually turns out to be impossible or unattainable.
4. The distinction between “digital” and “analog” is an illusion; everything is analog, digital is just a state of mind.
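Point 1 can be made concrete with Shannon’s self-information — a few lines of Python (a toy sketch, using probabilities I’ve picked just for illustration):

```python
import math

def surprisal_bits(p):
    # Shannon self-information: the less probable an outcome,
    # the more "surprising" (informative) observing it is.
    return math.log2(1 / p)

print(surprisal_bits(0.5))     # a fair coin flip: 1.0 bit
print(surprisal_bits(1.0))     # a certainty: 0.0 bits (no surprise)
print(surprisal_bits(1 / 64))  # a rare event: 6.0 bits
```

An outcome you could have deduced without checking (p = 1) carries zero information; the rarer the outcome, the more bits it’s worth.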
4.: Yes. Digital circuits just force a 1/0 decision at some (range) of threshold voltages. The system design drives them far enough to either side of the decision point to reliably reflect one state or the other.
“Reliably” is a tricky word. The decision threshold is not only analog but randomly variable. So are the timings: when does the signal switch, how long does it take, how long does it stay, when is it sampled by neighboring devices, for how long do those neighbors remain sensitive — all analog and somewhat non-deterministic. It’s a wonder that this stuff works as well as it does.
Thank you! This brief interchange has plumbed the (shallow) depth of my knowledge of micro circuits!
(I sit about 50 feet from the engineers that design them, so I always just point questions to them! Essentially all of our ICs are custom.)
But your explanations nicely show how everything in real life (except at the quantum level) is a distribution (samples of an analog reality). I try to explain this to people.
It never really came home to me until I dug into applying statistics to data (at work). Then it clicked.
Surprise! Quantum systems are “analog” too, even though they are described by discrete states. They’re really their own thing, but I’d put them closer to analog concepts than digital.
Very interesting! They are certainly not portrayed that way in popular science writing.
Perhaps a bit out of context, but the golfer Gary Player said “the more I practice, the luckier I get”.
We can make ourselves luckier.
It works in a lot of different senses. I was thinking of how serendipity in randomized algorithms is essential for efficiently finding approximate solutions to NP-hard problems, but couldn’t think of a way to put it in lay terms. Golf is more of a noisy control problem, but that works too.
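In case it helps, here’s a toy Python sketch of the kind of serendipity I mean, on MAX-CUT (an NP-hard problem): just guess random answers and keep the best one. A single random guess already cuts half the edges in expectation, which is a guaranteed 1/2-approximation:

```python
import random

random.seed(42)

def random_cut(edges, n_nodes, trials=200):
    # Assign each node to a random side and count crossing edges;
    # keep the best cut seen over many random tries. One random
    # assignment cuts half the edges in expectation.
    best = 0
    for _ in range(trials):
        side = [random.random() < 0.5 for _ in range(n_nodes)]
        cut = sum(1 for u, v in edges if side[u] != side[v])
        best = max(best, cut)
    return best

# A square (4-cycle): the optimal cut crosses all 4 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best = random_cut(edges, 4)
print(best)
```

Pure luck, repeated cheaply, becomes a respectable algorithm — “better to be lucky than certain” in action.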
(I haven’t looked at NP Hard problems since I tried CS at uni many moons ago)
I just wanted to mention that I am also currently reading Sean Carroll’s book and I think it is excellent. I have come to describe myself as a “Naturalist” as described in the book: Naturalism is the understanding that there is only one world, the natural world, exhibiting patterns we call the laws of nature, which are discoverable by the methods of science and empirical investigation. There is no separate realm of the supernatural, spiritual, or divine; nor is there any cosmic teleology or transcendent purpose inherent in the nature of the universe or in human life. Finally, purpose and meaning in life arise through fundamentally human acts of creation, rather than being derived from anything outside of ourselves.
Agreed. I read it a month ago and found it to be an eye-opener.
I’m not very conversant with Islam, but I recently learned about the origins of Shia and Sunni and it’s very similar to what happened in the LDS church. (As a former Mormon, I am very familiar with this bit of history.) When Joseph Smith died, there was a disagreement about who should succeed him, Smith’s son, Joe Junior, or Brigham Young, who was pretty much second in charge of things. As much as Smith’s wife wanted her son to become prophet, most people sided with Young, so the Smiths went off and founded the FLDS church. This is basically what happened to Islam so many years before hand. The majority (Sunni) went with the popular leader while the Shiites went with the familial leader, the son-in-law of Muhammad. As with the LDS/FLDS, those that went with the popular leader became the majority while those that went with the familial route became the minority.
My area is physics (General Relativity, very apropos this post!).
Special relativity is more easily understood as Newtonian mechanics with a speed limit, that of light. It can deal with accelerations just fine, but it doesn’t equate accelerations due to propulsion with accelerations due to gravity–that’s where GR comes in.
General relativity simplifies to special relativity in weak gravitational fields, i.e., when spacetime curvature is negligible. What we perceive as gravity is geometry.
That said, I think I would be quite happy for people to know Newton’s laws of motion, the laws of thermodynamics (the 2nd is tricky, I know, but it’s worth it to get it right); Also, people need to stop saying a couple things:
1. Never say “Einstein said everything is relative.” Please, just don’t.
2. Find out what the word “quantum” means before using it. It’s not that hard (well, maybe it is for Chopra et al.). Use the Feynman test on yourself: if you can’t say the same quantum thing without using the word “quantum”, you don’t understand it.
3. Avoid the use of the word “energy” unless you mean something similar to “capacity to do work measured in Joules, eV, etc.”, or provide an alternate definition before using it.
I learned a new thing today: The electronvolt (eV) is a unit of energy, not electrical potential (should have known that already …).
One electron volt is the unit of energy that would be imparted on an electron that passes through a potential difference of one volt. It is very little energy.
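A quick sanity check of the arithmetic in Python (the charge value below is the exact figure fixed by the SI definition):

```python
# Energy gained by one electron crossing a one-volt potential difference:
# E = q * V, with q the elementary charge (exact by SI definition).
ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs

one_eV_in_joules = ELEMENTARY_CHARGE * 1.0  # times 1 volt
print(one_eV_in_joules)  # ~1.6e-19 J: a very small amount of energy
```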
Yes, when I looked it up, they were using a lot of unusual (to me) prefixes on the numbers (smaller than nano-, pico-, or femto-) when comparing an eV to other energy units.
Hello Hector Mata,
I recently found this intriguing visualization of relativity; if accurate, it’s the best I’ve come across so far.
My Attempt at Visualizing Special Relativity (by carykh)
My question, if you’re interested: is it accurate (enough for non-experts)?
I *think* it’s right, and it’s entertaining. I think it’s a bit convoluted, but I couldn’t find anything obviously wrong with it.
“1. Never say “Einstein said everything is relative.” Please, just don’t.”
Most fervently agreed!
“2. Find out what the word “quantum” means before using it.”
And that applies most cogently to the myriads of ignorami who talk about a “quantum leap”. I do not think that means what you think it means, fellas.
I think “quantum leap” is destined to fall into the same linguistic category as “decimate”, which is now understood to mean the opposite (or numerical complement) of its original use.
Yes! And your use of “numerical complement” is well put! 🙂
Stuff I want creationists I have recently met to understand about evolution.
Evolutionary theory neither claims to understand the origin of life, nor the exact causes of the Cambrian explosion, but doesn’t need to in order to be credible.
(More generally, if a jigsaw puzzle is only partially assembled and you are stuck, it does not mean the parts that ARE assembled are done incorrectly.)
No one claims humans are descended from modern apes.
The fossil record shows clearly and unequivocally that humans did not co-exist with dinosaurs.
If and only if you do not consider present-day birds to be dinosaurs.
Missing out the “non-avian” qualification in statements such as “dinosaurs are extinct” is just leaving a hostage to fortune.
What a beautiful “key-points-of” summary Jerry.
If this were only the beginning of a whole new movement which would eventually encompass all topics of science.
I myself briefly considered just how I would treat “Computer Science” – it’s bloody difficult!!!
I think it was Einstein who once said something like “you don’t really know a subject until you are able to explain it clearly to your mother”
A favored read of mine available for at least a decade and in many public libraries: UCLA Dr Rose’s treatise re us blue collar service – workers: “The Mind at Work: Valuing the Intelligence of the American Worker” of isbn 0143035576 and https://goo.gl/OXpQsV.
From my field of computing: I think it would be nice if everyone understood the basics of how software is constructed and some elementary programming (for example, with Logo)
From my field of philosophy: the rudiments of propositional logic as well as some grasp of argumentation schemes, including various inductive and abductive ones. Which? Dunno. Yet.
I suspect the field I spent my working career playing around with – logistics – is an area most people either don’t know much about or, more likely, don’t care about. However it has great influence on nearly everything around you, particularly over the last 50 years.
Once considered primarily a military term having to do with the movement and supply of troops it employs millions today in the civilian world. Logistics can be divided into several parts but a large area now is in procurement logistics and distribution logistics. It can be the management of materials from raw product to finished goods but often only includes parts of this chain. There are many logistics companies, such as UPS or private and commercial trucking as well as warehouses or distribution centers. There are also sizable logistics departments within many major companies like Walmart, many other retail firms and most likely within the companies where you purchase your food. Today, the computer and logistics world are merged and there is hardly any part of logistics that does not involve computers.
Somebody had to post it…
What a fine commercial ! Thank you, p and Mr Schenck, for these explanations.
An aside: a kiddo of mine is, among other of his specialities, a legal aide attorney at the southwestern border dealing there in labor matters. He told me that, as a law student, an assignment of his then (a decade ago) involved an investigation into workplace – sustained injury and that workers in logistics’ carriers such as FedEx and UPS, just in terms of sheer numbers of on – the – job accidents resulting in trauma, impairment or suffering, outdistanced those experienced by farmers, miners or other such risky endeavors. I did not inquire of him re warriors’ incidents.
I suspect your kiddo is right. The lifting of heavy items is one of the highest causes of injuries for sure. For a few years I worked in a distribution center in California and the workman’s comp laws there made it possible to take some serious time off with the system. I would say we had 15 to 20 people on workman’s comp all the time in a facility with 500 employees.
Nice. Still, it’s not exactly the FedEx delivery Tom Hanks made at the end of Cast Away though, is it?
O Mr Kukec, but .that. film and Mr Hanks in it ranks right up there in my registry of top big – screen performances of all time.
No, not the same delivery is his at its conclusion; but, a bit earlier, when (character) Mr Noland’s closest ally, (character) Wilson the Volleyball, slowly but permanently drifts out of his realm, always my reach then is for the nearest nearly full tissue box.
Golly, that scene of Mr Noland’s yearning just freakin’ wrenches, not ?!
It’s an area I am curious about.
I have worked on the wharves for over thirty-five years. The transition from manual handling of cargo to containerisation and then computerisation and automation has been amazing.
I always wondered about the planning and organisation behind it.
As someone interested in logic, I wondered about the sound-alike field briefly once. How does one get into that area, anyway? You’re right that it is of supreme importance, influencing any number of practical matters …
My area was the Civil War era of American history. I’ll try to distill thousands of books and articles on the cause of the war to one short paragraph.
The South seceded because the Southern ruling class did not consider slavery safe under a Lincoln administration. The ruling class believed that the institution had a better chance of surviving in an independent slave republic. Lincoln used military force to put down the rebellion to preserve the Union, not end slavery. As time went by, the exigencies of the war allowed him to end slavery as well as preserve the Union.
Just a hobby for me but I love it. Very well said, especially the “ruling class” because they are the ones who made secession happen, not the population as a whole. The fire eaters.
May I ask, what is your favorite history of the Civil War (for the general reader and otherwise, if appropriate)?
I recently read McPherson’s, Battle Cry of Freedom and I couldn’t put it down.
I also recently read D.H. Donald’s Lincoln and found it excellent as well.
Some years ago, I tried to read Foote’s history and couldn’t. Too much detail. (“Too many notes, Mozart!”)
James McPherson is generally considered the greatest living Civil War historian. You made the right choice in reading Battle Cry of Freedom to get a modern one volume treatment of the war. If you are interested in reading in more depth on the topic, I would suggest any of the many volumes by Allan Nevins. This choice may surprise some because Nevins wrote in the mid-twentieth century. I read much of Nevins many years ago, but I remember it was great literature as well as history. Even though some of his interpretations in his earlier volumes are no longer accepted by most historians (he was too pro-South), his writing style is gripping and will engross non-specialists in the area with the drama of the time. In his later volumes, Nevins emphasized more the role of slavery in the coming of the conflict. The other great Civil War historian of the mid-twentieth century was Bruce Catton, another great writer. While most of Catton’s work concentrated on the military aspects of the war, Nevins wrote about both the political and military.
It is a shame that there are very few historians working today who can write like Nevins and Catton. Most historians write in an academic style that thereby bores the non-specialists. Doris Kearns Goodwin’s Team of Rivals is an exception. It deals with Lincoln and his cabinet who helped to win the war. It is over 700 pages, but I think you would enjoy it.
Finally, if you’re interested in reading historical fiction about the Civil War, nothing tops Michael Shaara’s Killer Angels. It deals with the Battle of Gettysburg. It’s fairly true to the historical record. It made the previously little known Joshua Lawrence Chamberlain into a virtual rock star for Civil War buffs. The movie Gettysburg was based on this book.
Thank you very much!
I remember some of Catton’s books being on my parents’ book shelves.
I will pick up Team of Rivals and Killer Angels for sure. I love well-written historical fiction.
Start with The Killer Angels. It is mind-blowingly good.
I’ve heard there may be more books on Lincoln than on Jesus. Besides what you have already listed, I found Eric Foner’s The Fiery Trial pretty good, and also Bruce Levine’s Fall of the House of Dixie.
Thanks; I’ll be reading some of that too.
This Republic of Suffering by Drew Gilpin Faust is my favourite Civil War book.
Thanks for all the comments.
Why not educate the public about your area of biology, speciation, by filming a discussion between the top researchers?
What about the 4 horsemen of speciation?
Coyne, Orr, Nosil and Feder
I would pay to hear that talk.
Despite the name, most of what you will learn in computer science has very little to do with computers – that’s for engineers. I can’t remember who said it or what the exact quote was, but it was something like this: “computer science is no more about computers than astronomy is about telescopes.” It has historically been a sub-discipline of mathematics; in fact, many universities consider the degree to be one in applied mathematics. Of course it certainly helps to understand how actual computers operate, and especially what their limitations are, but at least for my degree it wasn’t core to the discipline. I would also emphasize that computer science could be considered a sub-discipline of linguistics.
Computer programming is a skill that is used to build applications, and one doesn’t need to be a computer scientist to be a good developer, but it certainly helps when you have to design and architect large complex systems to understand the theoretical underpinnings of CS.
I try to explain this to my parents but they still ask me to fix their computer every time I visit.
I think you make a good distinction here, but when I did my degree in computing, back in the early 90s, Computer Science was at the techie electronics/instruction-set level that had everything to do with computers as machines.
My degree, at Brighton University UK, was a new-fangled course in ‘Computing and Information Systems’, classified as a BA rather than a BSc to try to attract non-techies. It emphasised the need for business-focused requirements definition, the need for an essential ‘specification’ foundation that the programmers could then weave their magic against. Clearly we didn’t need to know how computers worked to define business requirements.
I think one of the wonderful aspects of computing is the varied levels it all works on, how one builds on another without a need to know how the lower-level actually works i.e. there’s the electronics/CPU/instruction-set level; abstraction to high-level programming-languages; logical database design/modelling; application concepts; creativity of business solutions and machine-human interface; project-management and delivery.
As to which is the most important/effective/higher-paid – well, that requires the gathering of a representative of each discipline, lots of wine, and a long evening of debate!
Haha. Parents never understand our jobs. One thing I’m sure you’ll appreciate is the lack of understanding that wifi, Bluetooth and cellular (whatever frequency) are all possible through radios. It’s all using radio waves. I find people just don’t get this and it’s a fairly straightforward thing to grasp and important in understanding how our stuff all works.
My parents always had an outstanding summary of my chosen profession (Computer Science) which never failed to bring forth a smile – “Whatever they are paying you Howard, it’s not enough”
Since computation is a real-world process, computer science *is* largely a factual science (not a formal one like mathematics), though there are pockets that are formal, like (arguably) the theory of Turing degrees and a fair bit of other stuff in “higher” recursion theory. However, you’re right, it doesn’t have to do with computers, it has to do with *computation*. (In fact, my last degree is in “Logic and Computation” to emphasize that it is *that* aspect that matters.)
There was a joke when I was an undergraduate at McGill that computer science professors never used computers.
The astronomy/telescope remark was made by the late, great Edsger Dijkstra. He also said something to the effect that just because he was a computer scientist he didn’t have to inflict a computer on himself any more than medical researchers had to infect themselves with the diseases they were trying to cure – and he did all his own work with blackboard, chalk and typewriter.
My degree is in physics (a long time ago, when a first degree in the UK was worth rather more than it is now), and I spent half a century in the semiconductor industry. After a couple of decades I began to appreciate the importance of statistics. Over the next three decades I taught myself enough that all of my colleagues who knew anything about statistics reckoned I knew more relevant to the industry than anybody they had ever met. My attempts to pass on this expertise were not terribly successful; my colleagues had never met statistics beyond the basic and boring recipe level: plug in these numbers and this is what it means. I see this even on this admirable website; even PCC(E) missed the point a few days ago when he took a single example of a record temperature in the UK as evidence of global warming. It can easily be demonstrated that the larger the sample, the more likely an extreme value becomes.
So, my nomination for what constitutes an educated person is an understanding of fundamental statistical principles.
Well, the devaluation of the degree is an unfortunate but inevitable consequence of the “everyone must go to university, everyone must have a degree” meme.
Be able to identify a plant using a flora
Understand genetics and the role of plants in its history, and feel sorry for monks who had to eat wrinkled peas.
Understand what we know of photosynthesis and why it’s flawed
Have patience when people ask you what’s wrong with their house plants
Not my profession, but my avocation is officiating high school sports, specifically baseball, basketball, football (American), and softball. My particular problem has to do with non-officials’ knowledge of the rules. Over many years, I have been bombarded with comments about rulings I make that refer to what people have seen on TV. So I would like the people I deal with to understand that what they hear on TV may not be the high school rule.
E.G. The term “uncatchable pass” is not in the high school rules.
Baseball umpire here. I agree that baseball rules knowledge can be sketchy among “civilians” (and even participants). Many rules of baseball are quite technical, convoluted, and only occasionally come up in games. There is little reason to expect non-experts to know all of them. As a very good catcher once told me: “We don’t know the rules. Why would you think we should know the rules? We have you for that.” A lesson, I think, is that when things need expertise and study to understand, it makes sense to consult the experts, and usually makes sense to rely on them. They aren’t always right, but that’s the way to bet.
Now, about that “infield fly rule” … 🙂
I can never keep general and special relativity straight, either, but I do remember that one of them (can’t remember which) is much easier to understand than the other. Not that that helps me any now….
The way I remember it is that SR is a special case of GR. SR is easier to work with mathematically, as you don’t need tensors.
I think of SR as being a special case for things moving fast; GR is when you worry about gravity distorting spacetime. (That seems reasonably close to Sean’s answer.)
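For anyone who wants the “things moving fast” intuition made concrete, the standard time-dilation formula from special relativity (ordinary textbook material, not anything from Sean’s book specifically) is:

```latex
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta t' = \gamma \, \Delta t
```

At everyday speeds, where $v \ll c$, the factor $\gamma$ is essentially 1 and the effects are invisible; GR is what you need once gravity, i.e. curved spacetime, enters the picture.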
Where are the humanists in this discussion?
In my ideal curriculum as a professor of literature, the goal was to examine the cultural roots of story: where it came from, how it developed as one of the important uses of language, how it diversified generically (memetically, genetically) and, of course, how it worked on our minds, in the past as now.
Yet in practice I mostly taught reading. And as the years went by I taught more and more basic reading. And by the time I retired two years ago, college reading had become what high school reading was when I began 40 years before.
So teaching anything like my ideal curriculum was impossible.
“…the goal was to examine the cultural roots of story: where it came from, how it developed as one of the important uses of language, how it diversified generically (memetically, genetically) and, of course, how it worked on our minds, in the past as now.”
Sounds fascinating. It wasn’t done that way when I was in high school (1955-58) or college. Any references?
My graduate work was at the University of Chicago (1966-’70), Dept. of English. It was a heavily analytic curriculum, based on a rigorous understanding of Aristotelian philosophy, which was put into practice on literary texts through the ‘Poetics.’ Two other foundational books were of almost equal importance: Erich Auerbach’s ‘Mimesis’ and Noam Chomsky’s ‘Syntactic Structures.’ Taken together, these gave students a fairly coherent view of language as a universal among human cultures and, it followed, individual human capability.
Although it was early days for anything like evolutionary psychology, we young professors thought we could see that there was a common tendency toward story wherever we looked in the literate past; from that we reasoned that it was present before then as well, probably almost coeval with language itself.
Further, while we didn’t make the connection between story and its contribution to species survival that is now widely posited, we did explore that (connection) between story (telling) and human emotion, along with the persistence of story types (genres) over time and place.
Please note that our conceptualization, though we thought of it as ‘universal,’ was in fact only strictly applicable to the Euro-American West of the last four millennia. Nor was it anthropological so much as a kind of literary archaeology, one whose assertions could be tested in the recreation of the texts themselves. That is, through reading or performance: ‘Agamemnon’ still moves us today, and the better the reading/performance, the more we are moved.
This kind of teaching literature was a good defense against relativism. But, as I wrote in the original post, it depended upon a constant level of literacy among entering classes of students at the liberal arts college where I professed. This level, alas, began to slip sometime in the early 1990s, and so our beginning point in the curriculum had to be pushed back a bit each year. That in turn meant that our ending point was further and further from Parnassus.
I retired in disappointment. But I yet believe that something like the curriculum I’ve here described is a strong way to achieve both better literacy and maintain the liberal arts goal of understanding western cultural continuity.
There are Humanities folk here but I think most are lurkers. I occasionally see them pop out when coaxed. 🙂 But isn’t that the thing with Humanities? I find the kooks come out all the time, leaving the impression that the Humanities is populated solely with them; the regular folk rarely appear and go about doing their thing. There don’t really seem to be any popularizers of the Humanities as there are in Science.
Old humanist soldiers like me get it from both sides. On the one side, scientism (as outlined by Rosenberg) makes a very strong case that our discourses cannot lead to knowledge but are simply ‘stories about stories’–at best, entertainment; at worst, gobbledy-gook. On the other side are the raving relativists and sjw’s, who denounce any speech, essay or curriculum that even hints at positive value in the museum of texts from the western past (such as my suggestion above that ‘Agamemnon’ can still move us today, which they would find an outrageous cultural lie).
Well, I cannot meet Rosenberg’s argument in a satisfactory way (believe me, I’ve tried). Nor do the popomo’s (as I now must call them, since there’s another generation of the vicious little creatures scurrying around my feet like mini-me’s) show any interest whatsoever in dialogue according to the precepts of reason and evidence.
By the way, I enjoy your frequent posts on WEIT.
Agamemnon! My God the SJWs Would froth at the mouth with all that he got up to!! 🙂
I enjoy your posts too!
Social Justice Warriors.
An ironic term, usually derogatory, used by right-wingers, middle-of-the-roaders, and most old-leftists to refer to the more over-enthusiastic activists on the left.
I am partially one by training, though I do not work professionally as one anymore. (See above about logic and “argumentation theory”.)
I’ve heard the same about falling standards. What’s your guess as to why?
I have long pondered this question, Mr. Douglas, though alas not to any positive realization. The best I can do from a long retrospect is to say that our U. S. society is now more than half a century into an accelerated self-realization ethic for the education of its middle-class children and young adults that, like time’s arrow, is allowed to move only in one direction: excelsior!
If students fail, it is the institutions’ fault, and standards must be adjusted accordingly (downward). The bottom moves toward the middle; the middle moves toward the top; and the top either gets compressed and crowded or the very best students from the most affluent families go to prep schools and academies before taking their predestined seats in ivies or other exalted seats of higher learning.
Garrison Keillor’s quip about Lake Wobegon, ‘where all the children are above average,’ was funny until the day we teachers looked around and discovered it was true. By the time they reached college, a ‘C’ was pretty much a failing grade in students’ eyes. For at every juncture in the educational journey, parent and student expectations were higher than student capabilities (generally speaking). Yet those expectations were typically met – by faculties pressed by administrations pressed by school boards pressed by parents. Straight-A students abounding; dozens of valedictorians.
Too many of these same whiz-bang students, however, were not well prepared for the demands of a college curriculum. And I don’t mean only in the STEM subjects. Expository writing was, and remains, a serious weakness among college matriculants. My guess is that over their previous K-12 grades they simply had not been made to read enough, nor closely enough; for hidden somewhere in the morass of writing pedagogy (almost all of it failed) is the strong link between engaged reading and good writing. Do the one and you’ve got a powerful leg up on the other. It’s a matter of steady incrementalism until the ‘critical mass’ of literacy occurs. Then streaming out into the universe of reading and writing.
Suggest this to the state boards of education, to the schools of education, to the curriculum specialists, etc., and you’ll always get the very same answers (providing they’ll notice you at all): ‘studies show that. . .’ and ‘there’s not enough time to. . . ‘ As if social ‘science’ mapped anything about reality and time were fixed in unalterable allocation.
“If you feel so inclined, write a sentence or paragraph about your ‘area’, and what people should know about it to be considered ‘educated’.”
Cool question. I’m getting my PhD in public health genetics (PHG), a transdisciplinary area that combines expertise from four main fields–genetics, biostatistics, epidemiology, and bioethics.
Yesterday I downloaded the .csv files from Google Trends queries I did comparing the relative popularity of various topics in public health genetics, plotted these in R, and then used Markdown and knitr to publish the graphs online.
I did this for a variety of reasons:
1) It demonstrates that I can play with and display publicly available data (a biostatistics/epidemiology skill)
2) It provides a way for me to share what my program is on Twitter and, as an unanticipated bonus, here. Most people have never heard of PHG, as it is the only program of its kind in the country.
3) Getting to play around with this improved my computing skills, which are relatively nascent compared to those of people who spend all day programming. Being transdisciplinary, my time is split among integrating knowledge, communicating, and writing, with only minor excursions into code. So playing with this was fun, and it sparked me to learn plotly (a way to produce interactive graphs).
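For readers curious what that kind of analysis looks like, here is a minimal sketch in Python (the commenter used R/knitr/plotly; the CSV content below is invented stand-in data, not a real Google Trends export, which has a different header block):

```python
import csv
import io
import statistics

# Invented stand-in for a Google Trends CSV export: one row per week,
# one column of 0-100 "interest" scores per search topic.
raw = """Week,genetics,epidemiology
2016-01-03,62,18
2016-01-10,58,22
2016-01-17,65,20
"""

rows = list(csv.DictReader(io.StringIO(raw)))
topics = [k for k in rows[0] if k != "Week"]

# Average interest per topic over the whole period.
means = {t: statistics.mean(int(r[t]) for r in rows) for t in topics}
print(means)
```

From here, the real workflow would hand `means` (or the weekly series) to a plotting library; the data-wrangling step is the same in R or Python.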
I am not a software expert, but at one time early in my career I spent a lot of time designing and writing software for the company(ies) I worked for, and all of the computer talk upstream brought back my memories of a few things I fervently wished people would understand about using computers.
#1 – Not every task benefits from being done on a computer.
And like the GIGO statements above, automating crap processes just makes it crap out faster.
I think it’s also important to realize that much change is “just” random. First people need to understand natural selection, but then they should grasp that not all things are adaptations. Even some things that are adaptations now weren’t adaptations when they started.
Two additional evolutionary facts that everyone should know about; these are actually the Big ones for me, because I find these most inconsistent with non-evolutionary explanations of life:
(1) Sub-optimal “Design”. For example suboptimal design in humans, such as crossing of air and food paths in the pharynx (-> choking); the recurrent laryngeal nerve; and a long list of features in the human genome (pseudogenes, transposons, etc) that don’t make sense under the hypothesis of special creation by a benevolent creator.
(2) Biogeographic Distributions. For example closely-related species tend to occur in proximate localities, such as the occurrence of great apes only in the old world, and the closest ape-relatives of humans in Africa; or phylogenetic relationships reflecting precisely the chronological order of geological events, such as the origin of the Hawaiian islands.
Everyone should also know that the solar system is about 4.5 billion years old, and that life has been on Earth for about 3.5-4.0 billion years.
As a criminal defense lawyer, I’d like people to be aware of their applicable constitutional rights: the Fourth Amendment right to be free from unreasonable searches and seizures (in particular, from having one’s residence searched except pursuant to a warrant issued by a detached and neutral magistrate upon a showing of “probable cause,” supported by a sworn affidavit, that evidence of a crime will be found); the Fifth Amendment right to be free from compelled self-incrimination; the Sixth Amendment right to a speedy and public trial before an impartial jury at which one is represented by competent counsel (of one’s choosing, if one can afford it; appointed at public expense, if one can’t) and has a right to confront and cross-examine witnesses; and, if one is convicted, the Eighth Amendment right to be free from cruel and unusual punishment.
That’s not quite comprehensive, but it covers the highpoints.
As an attorney specializing in civil litigation, the one thing people should know is to avoid civil litigation at almost any cost.
First you have to explain how to avoid getting sued. I think the only way to do this is stop breathing.
That’s a great answer to the question, thank you!
Ha. I am currently changing career again, on leave (well) before starting in two weeks. Oy, it has been a long time since I did research, and it seems that every time I move further away from my roots in physics.
But I guess I could summarize the old area and, to tie over into the new one, use my main interest, which is constraining my choice of new career.
Embedded programming, for astrophysics and what not:
A computer doesn’t need to have a display, and a web of computers doesn’t need to be the Internet. Your car has a web of built-in computers that monitor, communicate and control its functions.
Phylogenies, our evolutionary histories, run deep and wide. The oldest common ancestor of all life may share traits with the geological formation that we descend from. [ http://www.nature.com/articles/nmicrobiol2016116 ] But even if that turns out to be wrong, all life on Earth shares the same genetic machinery. And that biological machinery is almost as old as our planet.
Conversely, the evolutionary tree, the diverging histories of species as they became isolated and changed independently of each other, has now grown to billions and billions of dead and living species.
That last part was supposed to be one paragraph by the book. Dunno what happened.
As a doctor:
1. More medicine DOES NOT equal better medicine (i.e., better health). There is an epidemic of healthcare overutilization going on that accounts not only for America’s out-of-control healthcare costs but also for much of its risk. We literally can do as much harm overdiagnosing and overtreating as underdiagnosing and undertreating.
2. Antibiotics are not harmless and are abused at nearly the same rate as illegal drugs (no studies to prove that one; just my gestalt; you get the idea).
3. Most diagnoses are made by a carefully taken history and physical exam. Technology is best used to confirm and clarify. (That’s a really hard one for people to believe, but it’s absolutely true.)
4. The most likely diseases to kill us are also those we have the greatest power to avoid: if we all avoided or minimized alcohol use, smoking, and saturated fat and exercised regularly, slept roughly 8 hours/night, maintained solid social relationships, and learned to deal constructively with stress, our lives would not only increase in duration but also in quality.
5. Having a strong relationship with a primary care physician to whom you have good access is probably the best way to extract maximal value out of the medical system while simultaneously minimizing the risk of harm.
Thanks for reminding us of this basic information regarding personal health. It is both a big and difficult decision to decide which tests and/or treatments outweigh the risks.
Thanks, doc. I did not know a lot of that, but it makes sense.
(Things “making sense” and my “not knowing” tend, unfortunately, to have a strong positive correlation in my case.)
As a doctor, too, I would add: All of us have to die, irrespective of the medical care received.
And as a psychiatrist (my field), “Life is not fair” and “There are pains -soul pains, not that I believe in any soul- you will have to endure”.
Hey, you forgot the trigger warning! 😉
Reblogged this on My Selfish Gene and commented:
How do you measure up? Coyne posits five things an educated person should know about evolution. Interesting. And as always at his site, be sure to sift through the comments.
Toxicologist (Drug Safety)
The dose makes the poison. Everything is safe at a low enough dose (homeopaths take note). Everything is poison at a high enough dose.
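The “dose makes the poison” idea can be sketched with a toy logistic (Hill-type) dose-response curve; the LD50 value and Hill coefficient below are invented for illustration, not real toxicology data:

```python
def toxicity_fraction(dose, ld50, hill=2.0):
    """Fraction of a population harmed at a given dose, using a
    logistic (Hill-type) dose-response curve. Purely illustrative:
    ld50 and hill are made-up parameters, not measured values."""
    if dose <= 0:
        return 0.0
    return 1.0 / (1.0 + (ld50 / dose) ** hill)

# Hypothetical substance with LD50 = 100 units:
for dose in (1, 10, 100, 1000):
    print(dose, round(toxicity_fraction(dose, ld50=100), 4))
```

The curve captures the toxicologist’s point: at a hundredth of the LD50 essentially nothing is harmed, at the LD50 half the population is (by definition), and well above it nearly everything is. Safety lives in the dose, not the substance.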
Yes, this is a great one. So succinct and so important.
In particular, oxygen and water are poisonous at high-enough doses.
Wasn’t Paracelsus the first to propose that?
One of the relatively few contributors I liked on The Conversation website: Ian Musgrave, “Paracelsus’ poison”.
I didn’t say it was original 🙂
Please come lecture the personal trainers and dietitians who work at the gym I belong to.
Actually I do that for a living but usually to medics.
What’s the LD50 of vitamin C in humans, anyway? 😉
D.R.S. – Differential Replicant Survival.
Biological evolution is a consequence of a genetic system of inheritance.
No replication can be perfect, hence variation in the form and function of phenotypes occurs. Environmental selection of favourable phenotypes/genes = biological evolution of populations. Darwin described the behaviour of genes before anyone else (Mendel?) knew about genetics.
Degree in Computer Science and worked as a software developer ever since. What I think people should know about the field is:
* IT and Business do not speak two different languages. Bad communication should not be attributable to an unbridgeable gulf, but down to the individuals and processes involved. We all speak the same language, we shouldn’t hide behind stereotypes and cliches for when that process is going wrong.
* Concepts that humans come up with are fuzzy in nature, and the idea that we can work everything out purely by thought is mistaken. The fine details are going to be a compromise effort, and the ability to recognise that is one of the most important steps in improving software quality.
* Technical skills are necessary but not sufficient for the software development process. The so-called “soft skills” are what make the difference between good and bad project outcomes, and this is just as true among developers as it is with everyone else.
* Software developers are human, and software development is a creative endeavour. High quality outcomes aren’t going to result by having people work crazy hours or in high-stress environments.
As a statistician, I have a lot that I wish people understood more clearly about the subject. I find that the most important misconceptions are different though depending on if the subject is a layperson, or uses statistical tools on a regular basis. So,
For the general public:
(1) All estimates are meaningless without corresponding measures of uncertainty. Indeed, statistics could be adequately defined as the study and quantification of uncertainty. What this means is that it is useless to know only that, say, group X is 5 times more likely than group Y to develop cancer in the next 20 years. You need some information about how uncertain that estimate is in order to infer anything, even just knowing the sizes of the groups used to derive the estimate gets you practically all the way there.
(2) Virtually all the statistics you see communicate information about what is *typical*. They often give you information about an average response, a general trend, a most common effect, a relative risk. That means there is *always* room for counterexamples to that typical response. In fact, the existence of such counterexamples is usually guaranteed, and this does not provide evidence against the typical response. So, for example, saying that not all smokers develop cancer adds nothing to the fact that smokers have a much higher risk of developing all sorts of cancers relative to the nonsmoking population.
(3) Related to both of these, census information is possibly the most valuable information that anyone can ever gather, precisely because the nature of a census is that *we measure the whole population*, we don’t just draw a sample. So the uncertainty (practically) disappears. You can’t get better information than that. Happily support your government census, because your country’s resources and taxes will surely be inefficiently used without it!
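Point (1) above can be made concrete with a quick sketch: the same estimated rate carries very different uncertainty depending on sample size. This uses the standard normal-approximation interval for a proportion; the counts are invented for illustration:

```python
import math

def approx_95ci(successes, n):
    """Normal-approximation 95% confidence interval for a proportion
    (the standard textbook formula p-hat +/- 1.96 * standard error)."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p - 1.96 * se, p + 1.96 * se

# Same estimated rate (30%), wildly different uncertainty:
small = approx_95ci(3, 10)      # 3 of 10 people
large = approx_95ci(300, 1000)  # 300 of 1000 people
print(small, large)
```

The point estimate is 30% in both cases, but the first interval spans tens of percentage points while the second is tight; this is exactly why a bare “group X is 5 times more likely” figure tells you almost nothing without the group sizes behind it.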
For applied practitioners:
(1) Don’t just apply a statistical test to attach a number to a result, or because that’s how other people do it in your field, or because that’s how you were taught to do it. Think about the kind of uncertainty you are trying to quantify, and then you can decide on an appropriate statistical treatment (or maybe none at all!). Numerical evidence does not make your work more objective. Numbers are convenient tools to talk about results, but they do not ever make your results.
(2) All analyses and all statistical tests require assumptions (sometimes very strong assumptions) that should be thoughtfully considered and assessed to the best of your ability.
(3) Think about what you are measuring and how you are measuring it. Once you go ahead and make that measurement, it’s likely going to be infeasible for you to go back and do it again if you find later that something should have been done differently, or the setup was less than ideal.
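As one illustration of point (1), choosing a treatment to fit the uncertainty rather than reaching for the default test: a permutation test quantifies evidence for a difference in means while assuming only exchangeability under the null, far weaker than the assumptions behind a t-test. The data here are made up for the sketch:

```python
import random

def perm_test_mean_diff(a, b, n_perm=5000, seed=0):
    """Two-sided permutation test for a difference in group means.
    Repeatedly reshuffles the pooled data into two groups and counts
    how often the shuffled mean difference is at least as extreme as
    the one actually observed."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            count += 1
    return count / n_perm  # approximate p-value

# Clearly separated groups give a small p-value; near-identical
# groups give a large one.
p_diff = perm_test_mean_diff([5.1, 5.3, 4.9, 5.2, 5.0],
                             [7.0, 7.2, 6.8, 7.1, 6.9])
p_same = perm_test_mean_diff([5.1, 5.3, 4.9, 5.2, 5.0],
                             [5.0, 5.2, 4.8, 5.3, 5.1])
print(p_diff, p_same)
```

The design choice is the statistician’s point in miniature: the question (“could this difference arise from shuffling alone?”) dictates the procedure, rather than the procedure being applied because it is customary in the field.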
And finally, one item that applies to both the general public and applied practitioners:
(1) Fancy math and complicated looking formulae do not automatically translate into better results. In fact, in practice, it is more often the more basic analyses that turn out to be the most informative. Do not be wowed into believing something just because the math looks fancy!
Indeed. Some level of familiarity with statistics should be something everyone has in their cognitive toolkit.
Beautifully written, Ed.
That was very interesting. Thanks!
Thanks for that! I love efficiency statistics but I am limited by my math issues. I have used it in a limited fashion in process improvement (Six Sigma), and that bit about figuring out the right statistical test is so true. I will always cling to the mathy people at work to help me (easy to find because I’m in IT). I also like your measurement part. I love to measure, but in order to root-cause and solve problems.
Also, the census! So important! The last federal government fired its statisticians and abolished the long-form census. I thought this was a slippery slope to totalitarianism, where data don’t matter and ideology is everything. Glad those guys are gone and our new government re-established the long-form census.
Not sure where “efficiency” came from up there.
Yes, the abolition of the long-form census by the Harper Conservatives was one of the worst things that government did, and I’m extremely glad the Liberals brought it back first thing. It really speaks to the mindset of the Conservative party here (and elsewhere I suppose). If you eliminate the facts, then you can simply fight an ideological battle. Every policy issue reduces to your opinion versus mine. This is a truly insidious way to govern, and if the general public understood this dynamic better, the Conservatives would lose a lot of votes.
And notice how pomo denial of truth just plays into this. The powerful know perfectly well that the truth matters – if only to deny it – so the weak get stomped on if they do not have the weapon of what is the case … (however uncertain)
As someone who works at a national statistical agency (in a support role – I do software development), thanks for the plug about censuses! (Those who have seen me around know why …)
I have a problem with the Many Worlds interpretation of quantum mechanics. An infinitely branching tree of possible worlds, none with a privileged status, is a lot to swallow. It may be accurate, but “Extraordinary claims require extraordinary evidence” as someone once said.
By the way, I loved Sean’s book, Kindle edition. It straddled the boundary of what we know and what we don’t know. The farther I got into the book the more I had to think and search for what he meant. I came away with a distinct impression that he’d be called a compatibilist in these parts. 🙂
He would, and that’s one part of the book I don’t like. The admission of determinism by compatibilists is usually grudging, followed by a refusal to admit that determinism versus libertarianism has huge consequences for society: much greater consequences than the semantic quibble between compatibilists (who, after all, are determinists) and “hard” determinists like me.
Everettian MWI is certainly counterintuitive. But that alone doesn’t make it “extraordinary.” All that’s required, per my rudimentary understanding anyway, is acceptance of what the QM equations actually say.
Allow me a retraction: calling my understanding of MWI “rudimentary” is probably way too generous.
I’ll grant that it at least points to a serious problem with our understanding of quantum mechanics. It doesn’t solve it to my satisfaction.
All of quantum mechanics is counterintuitive. After all, our ancestors needed to see lions and females (to name two), not atoms or energy fields. (I know, all we do see are energy fields.)
Everett just says, as you point out, to trust QM and to make sure the wave equation doesn’t leave out anything important — and that means the observer. Then let the math roll…
After having thought about it a while, I find it no weirder than multiverse theories.
Sorry, girls. I meant lions and people of the opposite sex. No, other sex. Maybe I should quit, hoping I’m ahead…
Things the general public should know about music: oh, who the hell cares.
Well, I do. What did you have in mind?
The general public thinks it already knows everything it needs to.
It feels that way about every subject of study or research — music, evolution, physics, nutrition, medicine — everything. That’s the problem.
My job is to edit written material. Press releases and write ups about events etc. My one piece of advice would be to take a moment to hit Spellcheck at the very least.
But isn’t that what you’re there for? 😉 I remember being given real junk to “edit” back in the day when I was an editor. It was better if I just wrote the whole thing because what was given was mangled so badly!
When people ask, I say I’m a protein chemist. In reply, someone once said, “I didn’t know there were any of those anymore.”
Anyway, the first thing is you need to learn the chemical structures of the 20 amino acids. You can largely appreciate evolution without knowing the detailed structures of the nucleotide bases. But knowing the structures of the amino acids will enhance your appreciation considerably. Next, you need to learn the one-letter code for all 20. With that, and since Venn diagrams were just featured here, you’ll be able to understand and appreciate this one.
In tandem with that, understand that DNA encodes proteins, and that proteins are linear polymers of amino acids (which then fold to specific conformations – secondary and tertiary structures – to attain their functionality). In the linear chain, any one of the 20 aa’s can follow any other, so there are 400 possible pairs (20^2). The average protein is easily 200 amino acid residues long (an amino acid residue is the amino acid less the water lost in polymerization). But if we extend our hypothetical protein chain to just 61 residues, the number of combinatorial possibilities becomes 20^61, which is approximately 10^79. To help you grasp that this number is humanly incomprehensible, 10^79 is at the lower end of the estimated number of atoms in the universe.
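If you want to convince yourself of that arithmetic, here is a quick sketch that checks it directly (the numbers are illustrative, exactly as in the comment above):

```python
# Combinatorics of a hypothetical 61-residue protein chain:
# any of the 20 standard amino acids may occupy each position.
AMINO_ACIDS = 20
CHAIN_LENGTH = 61

possibilities = AMINO_ACIDS ** CHAIN_LENGTH
print(f"20^61 = {possibilities:.3e}")  # on the order of 10^79

# Lower-end estimate of atoms in the observable universe, for comparison.
ATOMS_IN_UNIVERSE = 10 ** 79
print(possibilities > ATOMS_IN_UNIVERSE)  # True
```

Exact integer arithmetic makes the comparison trivial in Python; the point is just that even a short chain outruns the atom count of the universe.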
Now you’re ready for a table of the triplet code (no need to memorize the 64 possibilities), from which you’ll be able to grasp that single-base mutations tend to result in conservative changes in the amino acid sequence. This facilitates evolution by generally tempering the degree of change afforded by mutation of a single nucleotide base.
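That tendency toward conservative changes is easy to see by enumerating the single-base neighbors of one codon. The sketch below uses a small fragment of the standard genetic code (only the codons needed for this example); the leucine codon CTT is an arbitrary choice:

```python
# Fragment of the standard genetic code, just enough for this illustration.
CODON_TABLE = {
    "CTT": "Leu", "CTA": "Leu", "CTC": "Leu", "CTG": "Leu",
    "ATT": "Ile", "GTT": "Val", "TTT": "Phe",
    "CAT": "His", "CCT": "Pro", "CGT": "Arg",
}

def single_base_mutants(codon):
    """Yield every codon differing from `codon` at exactly one position."""
    for i in range(3):
        for base in "ACGT":
            if base != codon[i]:
                yield codon[:i] + base + codon[i + 1:]

for mutant in single_base_mutants("CTT"):
    print(mutant, "->", CODON_TABLE[mutant])
```

All three third-position mutants are silent (still Leu), and the first-position mutants give Ile, Val, and Phe, hydrophobic residues much like leucine, so most one-base changes here are conservative.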
In tomorrow’s lecture we’ll discuss Cyrus Levinthal’s paradox.
Geologist here. I think the most important thing from my field people should know about are the basics of plate tectonics. It’s hard to summarize, but this is my try:
1. The Earth’s mantle is solid (not molten – that’s a very common misconception) but ductile, and slowly convecting. Plate movement is part of that convection.
2. The uppermost part (~150-200km) of the mantle is cool and rigid. Plates consist of that part and the overlying oceanic or continental crust. Rigid mantle and crust together are also called the oceanic and continental lithosphere, respectively.
3. Oceanic lithosphere is short-lived, being continuously created at mid-oceanic ridges and destroyed in subduction zones (deep-sea trenches). Continental lithosphere is long-lived, but is pushed around, welded together and rifted apart by the processes which create oceanic lithosphere.
4. Fresh oceanic lithosphere is hot, but with time it cools, becoming denser, and it subducts once it becomes denser than the ductile mantle on which it rests.
5. Though generally not molten, the mantle does melt to some degree at mid-oceanic ridges because material rises and pressure drops, while temperature stays almost the same. The molten products form the oceanic crust. Melting also happens at subduction zones because the descending plate loses water to the mantle, lowering its melting temperature, and at hotspots like Hawaii, though these are independent of plate processes.
6. Prolonged volcanism, either at subduction zones or hotspots, will create crust that is too thick and/or light to be subducted, that is, continental crust.
I have been reading about plate tectonics lately. Your summary is excellent, just what I needed. Thanks.
Ex-petroleum geologist here. I guess what people would be most interested in at the moment is fracking. It’s a scary word, but the danger of the process itself is minimal. What people should be worried about is the large amount of water that needs to be used and disposed of. What people in my own state of Texas should know is that the Railroad Commission (the responsible regulatory agency here) is being appallingly reckless in completely rejecting any connection between disposal and earthquakes right now. Reasonable regulations could deal with the hazard, but they are in complete denial.
Maybe later I explain stuff about my second career as a private investigator.
Hmmmm… There’s something about “reasonable regulations” and “Texas” that doesn’t quite jibe. I think what you should say about fracking is “Since politicians are in complete denial about the risk, fracking is no longer a feasible strategy.”
Great explanation. I wish I had taken that plate tectonics course in university as an elective, but I had the sobering thought that it would be a lot of work for an elective, and I had a busy workload as it was. Damn though.
“I got balled up about the difference between Einstein’s General Theory of Relativity and his Special Theory of Relativity.”
The Special Theory of Relativity concerns light. Specifically, why is the difference between your speed and the speed of light always the exact same constant? This can be remembered as: “What is so special about light?”
The General Theory of Relativity concerns gravity (General => Gravity). The General Theory is a superset of the Special Theory, since it also includes how light (and time) behave within gravitational fields.
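The special-relativistic point about light's speed being the same for every observer can be made concrete with the velocity-addition rule, (u + v) / (1 + uv/c²). Here is a minimal sketch (the function name is my own, not from any source above):

```python
# Special relativity: velocities don't add linearly; they compose as
# (u + v) / (1 + u*v/c^2), which keeps the speed of light invariant.
C = 299_792_458.0  # speed of light in m/s

def add_velocities(u, v):
    """Relativistic composition of two collinear velocities."""
    return (u + v) / (1 + u * v / C ** 2)

# Chase a light beam at half the speed of light: it still recedes at c.
print(add_velocities(0.5 * C, C) / C)        # 1.0
# Two sub-light speeds never combine to exceed c.
print(add_velocities(0.9 * C, 0.9 * C) / C)  # ≈ 0.994
```

At everyday speeds the denominator is essentially 1 and the rule reduces to ordinary addition, which is why the effect went unnoticed before Einstein.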