Wednesday, August 3, 2011

Discipline Envy

You know how certain images pop into your head when someone says even a random noun? How it's almost impossible not to conjure them up - like when someone says "don't imagine a pink elephant". Yeah, me too. I know - just breathe deep. I'll walk us through this with some free association.


Test 1:  mathematician - go!

Test 2: biologist - go!

Whatever it was that popped into your head for "mathematician", shame on you for falling into stereotypes that way - thinking it's true that you can tell the extroverted mathematician from the introverted one because the extroverted one is staring at someone else's shoes. You're a real asshole if you just laughed at that.

No matter what else one thinks about mathematicians and their social skills, the theme linking us all together is "smart". This is occasionally true. Mostly, we're just very, very patient. Some problems remain unsolved for hundreds of years, with lots of "smart" people trying to figure them out. Anyway, the visual I get in my head is of how separated we are from our science brothers and sisters. They get to do the fun, exciting things like showing off their barking dog voodoo. And they won't share their toys, like the Large Hardon Collider. They even hide it underground since they know we're only good at finding where things cross the horizontal. We're congenial people*; unlike scientists, mathematicians never resort to "going negative".

Almost as exciting as watching me find the equilibrium point in some predator-prey model, right? *givesyouthehairyeyeball*
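(For the three of you who now can't resist: a minimal sketch, using the classic Lotka-Volterra model with my own choice of letters - nothing here is from any particular paper:

    dx/dt = a*x - b*x*y    (prey)
    dy/dt = d*x*y - c*y    (predators)

Set both rates to zero, toss out the boring x = y = 0 case, and the nontrivial equilibrium falls out: x = c/d, y = a/b. That's it. That's the whole show.)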

No, it's true. They get to do some wicked cool shit that looks a lot like magic.

That's not why I'm envious though.

While my mathematical brothers and sisters are in someone's mom's basement with a pencil, a notepad and an iMac, the scientists are upstairs at the dinner party dealing with the party crashers. They get all the fucking crazies. Anti-vaxxers, creationists, homeopathy quacks, chiropractor shit, acupuncture, water diviners, Deepockets Chopra, quantum woo and all that fun stuff.

Sure, occasionally a state legislature in the US will try to pass a law defining pi to be 3 (or 3.2, in Indiana's famous 1897 near-miss). But really? Who's going to get excited about a rounding error - could happen to any old run-of-the-mill crazy. Nothing to write home about. This is about the extent of abuse I get.

Really clever crazies like this never come along:

[embedded video: "A howler"]

I just want my own share of the crazy, scientists.

*sniffle*

* This last video is the hyperlink for "we're congenial people".

24 comments:

Spence said...

And what about the mathematicians who think pi is all wrong, and that we should instead use tau, defined as two pi (the ratio of the circumference to the radius), because they think it is better - because it eliminates that awkward, unnecessary constant in C = 2*pi*r?

Of course they miss the minor point that this then introduces an unnecessary constant into the downright sexy equation A = pi*r^2.
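(To put numbers on it - a quick sketch, assuming nothing beyond defining tau as 2*pi:

    import math

    tau = 2 * math.pi  # the tau-ists' proposed constant
    r = 3.0            # any radius will do

    # Circumference: the constant doesn't vanish, it just moves.
    C_pi = 2 * math.pi * r    # 2*pi*r
    C_tau = tau * r           # tau*r - look, no constant

    # Area: and here it pops right back up.
    A_pi = math.pi * r ** 2       # pi*r^2
    A_tau = 0.5 * tau * r ** 2    # (1/2)*tau*r^2 - hello again, constant

    assert math.isclose(C_pi, C_tau) and math.isclose(A_pi, A_tau)

Same circles either way; the constant just migrates from one formula to the other.)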

Those mathematicians clearly have too much time on their hands. Seriously. Perhaps they could take up wanking.

Justicar said...

Yeah, those are the annoying mathematicians. They're easy to stop as students though. When dealing with families of functions, this variety gets really anxious - say, with indefinite integrals - at my leaving the constant off until the end instead of carrying c1, c2, and so on around the whole time and then combining them into just "c".

A lot of students get uncomfortable with the idea that an arbitrary constant plus an arbitrary constant is just an arbitrary constant.
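(Concretely - my own toy example, nothing deeper:

    integral of (2x + cos x) dx
    = (x^2 + c1) + (sin x + c2)
    = x^2 + sin x + c,    where c = c1 + c2

c1 + c2 is just some number, so we rename it c and move on.)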

And they don't always go away after becoming real mathematicians.

They're slightly less annoying than the Laplace variety proof-writing ones. But only slightly less annoying. Like, super duper tiny.

Justicar said...

By "stop" I meant spot. But spotting them helps to address them individually so they'll stop it! But it wasn't what I was meaning to say.

TheStephenation said...

I like tau. I was never actually taught trigonometry with it, but it sure looks like it would make learning trig easier than the way I learned it with pi. And I don't see what's wrong with (1/2)tau*r^2. I already have to deal with (1/2)mv^2 and (1/2)kx^2.
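(And the 1/2 shows up for the same reason it does in those physics formulas - a sketch, adding up the circumferences of thin rings:

    A = integral from 0 to r of tau*t dt = (1/2)*tau*r^2

the same shape as integrating k*x dx to get (1/2)*k*x^2.)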

Anonymous said...

What is a "Laplace-style proof"?

Justicar said...

OK, after putting on my shpectacles, TheStephenation's comment makes more sense. I thought it read "I like tan, and that never took trig with it." I was like, wtf?

Nonny:
Laplace, and many, many others, had/have this really annoying style of skipping all of the difficult work and replacing that with "it can be shown that" and then whatever.

If it can be shown and you're writing a paper, how about showing it?
You know, save me a lot of work trying to build a bridge out of Africa or something. Fuck. I like to read up on things that aren't my area, and I'm no super-magic mathematical whiz. I have to put a lot of effort into it; if it were so bloody obvious that it could be shown, one wouldn't be writing a paper on it at this late stage in the game.

Anonymous said...

It's funny. For biologist I immediately thought of PZ. Dang.

For Mathematician I completely drew a blank. After a few seconds I thought of Dirac. Why Dirac? He's really thought of as a physicist.

Then I thought of my PhD supervisor. Then I finally remembered that I fit into that category too! As well as most of my friends from grad school.

I find it odd that I just don't think of myself as a mathematician. Possibly because I mostly publish in physics journals; but I think more so because it is not what I feel I am, it is merely what I do. Maybe if I loved what I do more... I dunno...

-Cereal

Spence said...

TheStephenation, thanks for the reply :)

My argument about how sexy pi*r^2 could be was really mocking those who made the same argument that 2*pi*r was clumsy because of the constant. A kind of reductio ad absurdum on worrying about constants (because if you get rid of one constant in one equation, a different one will just pop up elsewhere). Essentially I'm saying worrying about where the constants end up is really not a good argument for choosing between pi and tau.

I also find it a little hard to believe that switching some constants around would make it much easier to understand. Perhaps some things would be easier to remember (and other things harder), but understanding? I don't see it myself. And to me, understanding is the most important part of learning.

The primary concern on this, to my mind, is not to have a mix of the two - that would be a recipe for disaster and confusion. But since pi is already well established in textbooks etc., the decision is made for us. Pie for two, please!

Anonymous said...

"Nonny:
Laplace, and many, many others, had/have this really annoying style of skipping all of the difficult work and replacing that with "it can be shown that" and then whatever."

Thanks, I did not know that term. If you find a glaring example where this happens and don't mind sharing it, I would like to see it.

-jdaniel

Justicar said...

I don't have an example of a poorly written proof handy, so I'll write one right now.

Suppose we're interested in proving the sum of two odd integers is an even integer.

Let m and n be odd integers. Every odd integer, x, is defined x = 2c + 1 where c is some other odd integer. Therefore, we let:
n = 2c(*) + 1, and m = 2c(**) +1.

Consider:
n + m = [2(c*) + 1] + [2(c**) + 1].
It can trivially be shown that n + m is commutative, associative, algebraic and factorable.

Viz., m + n = 2(c* + c** + 1); therefore, m + n is even.
QED

This is a perfectly valid proof. It's also a perfectly shittily written proof.
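For contrast - and since the complaint is about the writing, not the fact - a careful version of the same proof might read:

Claim: the sum of two odd integers is even.
Proof: Let m and n be odd integers. By definition, there exist integers j and k such that m = 2j + 1 and n = 2k + 1. Then
m + n = (2j + 1) + (2k + 1) = 2j + 2k + 2 = 2(j + k + 1).
Since j + k + 1 is an integer, m + n is twice an integer, hence even. QED

Every step shown, no "it can trivially be shown", and j and k are just integers, nothing fancier.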

Anonymous said...

Sorry to be pedantic, but " where c is some other odd integer" should just be some other integer: If x=101, c=50. Also I think it should be: +,* satisfy the comm., assoc., distr. properties, i.e. Z is a commutative ring.

Anyway, I too get frustrated when reading math papers and the authors are not very kind to the reader. But I do not think we can expert the expert in a field to necessarily be kind to the non-experts. So that's what I was wondering if you were talking about, but possibly you just meant bad math writing skills.

-jdaniel

Anonymous said...

*expect

Justicar said...

Actually, that won't work if you get x = 3
3 = 2c + 1;
3-1 = 2c;
2 = 2c
1 = c

The reason it's hard to see the features of this proof clearly is that all of the things I say are trivially "showable" are left unshown.

So, you have to do that all on your own; since I'm the one writing and publishing (say, back when this was new lol), maybe I would have been the *only* person who had at that point been able to work this particular feature out.

It would be a dick thing for me to write it that way specifically because the thing that makes it true is the thing I left out and even you missed it.

It's extremely discourteous writing if one's goal is to spread knowledge as opposed to showing that one is just oh so clever.

Anonymous said...

You said *every* odd integer, x, is expressible as x=2c+1, where c is also an odd integer. But I gave you an example, x=101, for which this statement is not true.

-jdaniel

Anonymous said...

Also I didn't miss the things you said were trivial to show. Rather, in this type of proof it is OK to take those things (commutative, associative, distributive) as properties of addition and multiplication for the integers. You do not have to prove them.

Unless you are Bertrand Russell, lol, I don't think you have to prove those. But maybe that's the level at which you were talking.

-jdaniel

Justicar said...

There's the crux of the problem though. You're saying "at this level".

I don't write proofs for my own benefit; I write mathematics so that others long after I'm dead might look at my work and learn something.

Why would I then write as though people reading my work already know it? After all, no one publishes proofs that are already known . . .

I'm rather certain there isn't a special elite cabal of mathematicians who know all these secret proofs that one of us regular ones might happen upon . . .

Anyway, you haven't shown a case where this has failed. I defined it that way:

every odd integer, called x, that is defined as 2c* + 1, where c is some other odd integer.

That would exclude 101 from being an "x" in my definition there. =P
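(For anyone keeping score at home - a throwaway check, mine, of which odd integers the constraint "x = 2c + 1 with c odd" actually covers:

    # odd x for which c = (x - 1) / 2 is itself odd
    hits = [x for x in range(1, 25, 2) if ((x - 1) // 2) % 2 == 1]
    print(hits)  # [3, 7, 11, 15, 19, 23] - exactly the integers of the form 4k + 3

So 3 qualifies (c = 1, odd) and 101 doesn't (c = 50, even).)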

Anonymous said...

Wow! For someone who is so self-righteous about ownership of mistakes, you seem entirely reluctant to admit a small error. Let me quote what you wrote:

Every odd integer, x, is defined x = 2c + 1 where c is some other odd integer.

This is a false statement.

That would exclude 101 from being an "x" in my definition there

So you did not mean to say every x, but only those which are expressible as x = 2c + 1 with c odd. Well that is stupid, because then your proof will be incomplete and therefore not correct. For example it will not handle the case where m or n = 101. :0)

-jdaniel

Justicar said...

The only way you'd be able to make a claim that I'm not admitting error is if the claim is actually in error.

The axiom I chose is an entire set of constraints on x (and thus on any particular c that satisfies it). I fail to see why you think that defining c as necessarily an odd integer doesn't constrain what the output can be.

So, you're saying I have some kind of fatal mistake such that the range of the map isn't constrained by the domain of the map.

I fail to see why you're so butthurt about the claim that the statement

"every odd integer, x, is defined as 2c + 1, where c is some other odd integer"

fails.

The c's are the domain; by constraining that, I can force whatever conditions I want on the range.

This is fairly simple, and straightforward language here.

There is no problem at all in that entire definition, unless, you know, you read half of it and ignore any further restrictions.

Incidentally, the incompleteness of a proof doesn't invalidate it; conditional proofs are just fine so long as one remains within the . . . the constraints.

But sure, feel free to call me a liar if it makes you feel better.

That doesn't make the definition fail in the slightest degree though.

Also, bear in mind that I wrote that as a proof, an example meant to show off, as I said, a shittily written proof. It is technically correct; it is also horribly written. As I said. From the outset.

*smiles*

Anonymous said...

I seriously doubt that you are a mathematician. What was the title of your PhD thesis if you don't mind my asking?

Justicar said...

Oddly enough, and it's funny you should ask.

I wrote it on how to write technically correct, but still very poorly written proofs, on the internet so as to piss off blog-comment-heroes.

It was groundbreaking work, a real riot.

Might I add you to the data set?

Anonymous said...

Also what does it mean that + is algebraic?

It can trivially be shown that n + m is commutative, associative, algebraic and factorable.

Please keep your answer simple like your last comment.

By the way, you're a real blog-comment hero. I have actually enjoyed your posts. I just want to know why you are pretending to be a mathematician.

-jdaniel

Justicar said...

*purses lips*

Instead of leading you on for the lulz, I'll refer to the already written answer to that which I stated antecedent to writing the proof:
"Nonny:
Laplace, and many, many others, had/have this really annoying style of skipping all of the difficult work and replacing that with "it can be shown that" and then whatever."

This proof was written, I presume, nonny, in response to your having asked me to show you an example of a poorly written proof.

I'm sorry to have entertained your request for an example of a proof with the very features I said made that style of writing bad in the first place. I gave you what you wanted, but apparently you didn't want what you asked for.

*note to self: do not under any circumstances give examples to anyone of how something should not be written, even when they ask you to do it*

Lesson learned, nonny.

Anonymous said...

OK thanks for showing the kind of mistakes a high schooler would make instead of an actual Laplace style error.

I guess I should be grateful for your nonexample.

Justicar said...

Gee, nonny, if only I had expressed my thoughts on levels of writing or anything like that.
"There's the crux of the problem though. You're saying 'at this level'.

I don't write proofs for my own benefit; I write mathematics so that others long after I'm dead might look at my work and learn something.

Why would I then write as though people reading my work already know it? After all, no one publishes proofs that are already known . . ."

You might be an elitist for whom simple explanations are insufficiently academic to demonstrate a point, but that's your own problem.

I care not if you consider something to be beneath you, or unworthy of your imagined intellectual wherewithal.

I guess that I should just start assuming that every random anonymous internet person who comments here is a fucking PhD mathematician, and that I should write at that level.

No, it couldn't possibly be that I care more about spreading information on subjects that strike my fancy from one day to the next than impressing someone who is literally meaningless in my life. Yes, I should definitely start making sure that I change my disposition on how I want to a.) write something poorly, b.) on purpose, c.) that is an example of d.) something that is poorly written.

What was I thinking?

And your response to the whole fucking example of a poorly written proof is to, wait for it, take me to task for its fucking being poorly written like some kind of volcano that prematurely erupts because a butterfly in goddamned Japan lands on a fence.

If you find me writing a sincere proof to prove something that doesn't come as the result of "how would it look if it weren't written well?", then you'll have the ghost of a point that I *might* possibly not be a mathematician. Of course, this excludes the fact that as a person I might just, you know, not type perfectly well in all circumstances.

But let's review what information is available on my blog here:
1.) I had surgery a week ago
2.) because of 1.) I'm fucking on strong ass pain killers
3.) I was asked what my Laplace-type poorly written proofs would look like
4.) I made one up in like seconds
5.) in the comments section of my blog
6.) that wasn't meant to be a freestanding post, thought out and planned
7.) you're a complete dick for being "pedantic" about a proof I conceded in advance to be poorly written, on the grounds that it's, you know, not poorly written in the right way.

Proof writing is by design a pedantic enterprise of technical, necessary punctiliousness. What one author finds too trivial to warrant explication, another won't. And it's on an example of that very feature that you want to play "gotcha".

*lesson learned*