- Active Posts: 1,563 (0.5 per day)
- Most Active In: Physics and Mathematics (584 posts)
- Joined: 05-June 05
- Last Active: Jun 23 2013 09:50 PM
- Age: Unknown
- Birthday: April 28
- Gender: Not Telling
Topics I've Started
28 January 2012 - 02:01 AM

I'm not sure exactly what thread to put this in, so since my experiences are in physics, it's here.
So, some background- I finished a PhD in theoretical physics a few years back. After finishing, I applied to industry jobs, liberal arts college positions, etc. I couldn't land any sort of "real" job, so I took a postdoc at a prestigious university. After finishing the postdoc, I applied for jobs again. No bites on the academic front. My previous experience had taught me a bitter truth- no engineering company (at least in the US) will hire a theoretical physicist when there are plenty of engineers to fill positions. Why take the time training a physicist when you can hire an engineer ready to go day one? Still, I tried again and again. High schools won't touch me because they don't want to pay more for a PhD who has no high school teaching experience. For historical reasons, my research coding was mostly Fortran 77, so my programming portfolio lacks object-oriented code, and even trying to get a standard programming job has been difficult. After several months of unemployment, I recently landed an actuarial-type job at an investment bank.
My career in physics lasted nearly 10 years, the last few as a postdoc. I never got a chance to direct my own research; I always worked for someone else. I'm 30, and despite holding a PhD I have never made more than the manager of a McDonald's. In the job I just started, I will make roughly the average salary of a recent college graduate, and I use nothing of what I spent the last decade learning. Economically, I threw the last ten years down a hole. I drive a twenty-year-old car, I have no savings, and I have only recently begun to pay down my undergraduate debt.
AND, THIS IS THE NORMAL OUTCOME FOR A PHYSICS PHD. My classmates who have graduated have nearly all left science. They are working for banks or insurance companies because technical companies won't hire them. I'd estimate that 1/10 got the coveted tenure track position, and another 1/10 got an industry job where they could do some science. The other 4/5 of us are working jobs we could have had after undergrad, at the same pay we could have had after undergrad.
And yet, I hear every day that we have some scientist or engineer shortage in the US. If that were true, someone with my broad technical background should be able to step into one of any number of positions that companies are having trouble filling. This is not the reality. There is no scientist shortage; all the reports that say scientists will soon be retiring, etc., are wrong- the retirement peak happened years ago. We train substantially more PhDs than there are jobs, and we have been doing so since the 70s.
So why does this myth of a shortage get perpetuated? What does encouraging people to throw away what could be an otherwise productive decade of their lives accomplish?
Paradoxically, we not only train too many scientists, but also have a largely scientifically illiterate populace. Should we stop encouraging people to go into science? If we allow an actual shortage of scientists to develop, the rising salaries and more stable career path for scientists might raise not only their job prospects but also their social status. In a world where people are competing for the good science jobs (instead of the good business jobs), more people would have legitimate incentives to study science, and perhaps we'd have a more scientifically literate populace.
By encouraging students into science while misleading them about the career opportunities, are we (counterproductively) creating a less scientifically literate populace?
16 February 2010 - 10:05 AM

I have heard for the last decade or so that the US in particular has a shortage of people going into math and science, and that this weakness will somehow hurt the US in the future.
However, in reality the case appears to be just the opposite. There are so few jobs in mathematics and the physical sciences that any jobs that open up are intensely competitive. The average PhD spends 6-10 years in low-paying postdoctoral positions waiting for an academic position to open up, and the competition for these rare spots is fierce. Even liberal arts and community colleges have become competitive when faced with this huge glut of scientists. The majority of PhD graduates leave the field and go into finance, consulting, project management, etc.
So why do we, as a society, perpetuate this myth that we need more people in mathematics and the physical sciences? Was there ever a shortage?
24 September 2008 - 07:15 AM

Moderation note: This thread was created from posts moved from the thread Large Hadron Collider.
7DSUSYstrings said: "The other thing I'm exploring is adding helium to our atmosphere to reinforce the inert gas layer that, by my calculations, is MUCH thinner than it was 150 years ago. Perhaps dangerously thin. The problem is the shortage of the gas."
The other problem is that helium cannot be gravitationally bound by our fairly small planet and escapes into space. This is (of course) the reason that helium is in short supply.
20 December 2007 - 02:41 PM

I was going to post this on Craig's thread of a similar title, but realized that I could drag the thread off of its purpose. The purpose of my thread is to demonstrate that information entropy and thermodynamic entropy aren't just analogous but are actually the same thing (i.e. information carries thermodynamic entropy). This line of reasoning owes everything to Charlie Bennett.
Note: I always work in units where temperature is measured in energy units, which means Boltzmann's constant is 1. This means my entropy is dimensionless. I apologize if it's confusing, but it's what I'm used to.
Let's examine the following simple device (courtesy of Leo Szilard): a molecule is trapped in a chamber with a shutter in the middle, and movable pistons at each end.
Now, imagine that we quickly close the shutter (the particle is trapped on either the left or the right side).
We now have a machine that can measure which side the particle is on (left or right), and push the piston on the empty side all the way closed. Pushing this piston closed takes no work, as there is no particle there to offer resistance.
Next, we open the shutter, allowing the particle to expand against the piston isothermally. This lets us extract some useful work, as follows (p is pressure, V is volume):

$W = \int_{V/2}^{V} p \, dV' = \int_{V/2}^{V} \frac{NT}{V'} \, dV' = NT \ln 2$

In the second step, we used $pV = NT$ to replace pressure with volume to do our integral. It's isothermal, so temperature is constant.

So, in our case, $W = T \ln 2$ (we have one molecule, so N = 1). Now, we can close the shutter and start all over again. However, since we have extracted useful work from our engine, its entropy must have changed. Using the thermodynamic formula $\Delta S = Q/T$, we see that our entropy has actually dropped by $\ln 2$, but nothing else has changed! How can we save the second law, which tells us total entropy must always increase?
The answer lies in the following- we measured whether the particle was on the left or right side, and we had to store this information somewhere. At best, this can be stored in 1 bit, which Shannon would tell us has information entropy $\ln 2$ (in our dimensionless units). So the second law is saved because information entropy equals thermodynamic entropy!
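Both numbers can be checked directly. Here is a minimal sketch in Python (my language choice; the values of T and V are arbitrary, since only the ratio of volumes and the bit probabilities matter): it numerically integrates the isothermal work from V/2 to V and computes the Shannon entropy of one fair bit in nats.

```python
import math

T = 300.0  # temperature in energy units (k_B = 1); arbitrary value for the check
V = 1.0    # full chamber volume; arbitrary units

# Work from the isothermal expansion: W = integral of (NT/V') dV' from V/2 to V,
# with N = 1 molecule, evaluated by midpoint-rule quadrature.
steps = 100_000
dV = (V - V / 2) / steps
work = sum(T / (V / 2 + (i + 0.5) * dV) * dV for i in range(steps))

# Shannon entropy (in nats) of one bit with p = 1/2 for each side: -sum p ln p
p = 0.5
shannon = -(p * math.log(p) + (1 - p) * math.log(1 - p))

print(abs(work - T * math.log(2)) < 1e-3)   # True: W = T ln 2
print(abs(shannon - math.log(2)) < 1e-12)   # True: one bit carries ln 2 nats
```

The erasure cost of that one bit, T times its entropy, exactly matches the work extracted, which is how the second law survives.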
In any physical situation, we have to store bits in a storage medium, which is limited. So we can't extract useful work forever; eventually we have to wipe the hard drive- and this will require an energy input! Bare minimum, it will take the energy we just harvested from our engine.
edit: minor edits, for (hopefully) clarity
15 November 2007 - 06:48 AM

There has been some indication on other threads that perhaps the Higgs mechanism/Higgs boson are rather poorly understood. Hence, this thread. I'm going to attempt a sort of "question and answer" on the Higgs. Unfortunately, before I can really talk about the Higgs I'll need to give a whirlwind tour of some important ideas in particle physics. I'm purposefully brushing past a lot of subtleties to paint the big picture, and there will be some things you may have to trust me on.
First, we'll need to develop some concepts. The first is the idea of a Lagrangian. One of the more interesting developments of the mid 1800s is the idea that classical physics can be described by the statement "particles minimize their action," where the action (S) is defined as

$S = \int L \, dt$
That funny looking L is the Lagrangian, and it is fundamental in modern physics. It turns out that the way to generalize classical physics concepts (like electricity and magnetism) to the quantum world is to figure out the Lagrangian for the classical theory and work with it. In classical mechanics it is standard to think of the Lagrangian as

$L = T - V$
where T is kinetic and V potential energy. In classical mechanics the action is always minimized, but in quantum mechanics, we can have fluctuations away from the minimum.
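To make "particles minimize their action" concrete, here is a small numerical sketch (not from the original post; the path shapes are arbitrary choices): for a free particle with fixed endpoints, the straight-line path has a smaller discretized action than any wiggled path between the same endpoints.

```python
import numpy as np

def action(x, dt):
    """Discretized action S = sum (1/2 m v^2 - V) dt for a free particle (m = 1, V = 0)."""
    v = np.diff(x) / dt              # velocity on each segment
    return np.sum(0.5 * v**2) * dt

n = 100
t = np.linspace(0.0, 1.0, n + 1)
dt = t[1] - t[0]

straight = t.copy()                      # classical path from x=0 to x=1: constant velocity
wiggled = t + 0.1 * np.sin(np.pi * t)    # same endpoints, but with a wiggle added

print(action(straight, dt))                        # ≈ 0.5
print(action(straight, dt) < action(wiggled, dt))  # True: the classical path wins
```

Any perturbation with the same endpoints only adds kinetic energy along the way, so the straight line is the minimum.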
The important things in the Lagrangian of particle physics are called fields, and they represent the types of particles we can have. Simple functions (called scalars) represent spin 0 particles; if they are complex, that means the particle is charged. Vectors represent spin 1 particles, and there are objects called "spinors" that represent spin 1/2 particles that I will do my best to avoid, as they are a bit more complicated.
The next concept we'll need is symmetry. This is the primary tool that helps us build the Lagrangians for our theories. Generally, what we do is decide on a symmetry our theory should have, and then use for our Lagrangian every possible term that has such symmetry. As a concrete example, suppose we have a theory with a spin 0, charged particle. This is represented by a complex function $\phi$. There will be a term in the Lagrangian that looks like $m^2 \phi^* \phi$. This is called a mass term- any term that is quadratic in a field (i.e. the field squared) generates a mass for the field.
This term has a symmetry: we can multiply $\phi$ by $e^{i\theta}$. If we do this we get

$m^2 \phi^* \phi \to m^2 (e^{i\theta}\phi)^* (e^{i\theta}\phi) = m^2 e^{-i\theta} e^{i\theta} \phi^* \phi = m^2 \phi^* \phi$

so the term is unchanged.
This type of symmetry is called a U(1) symmetry. Notice that we could also allow $\theta$ to be a function of x and t. In other words, we could use a slightly different transformation at each point in space and time. This is called a "gauge symmetry."
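This invariance is easy to verify numerically; a quick sketch (the field value and m² are arbitrary test numbers):

```python
import math, random

random.seed(0)
phi = complex(0.6, -1.1)  # arbitrary value of the complex field
m2 = 2.5                  # arbitrary m^2

mass_term = m2 * (phi.conjugate() * phi).real
for _ in range(5):
    theta = random.uniform(0.0, 2.0 * math.pi)
    phi_t = complex(math.cos(theta), math.sin(theta)) * phi  # phi -> e^{i theta} phi
    transformed = m2 * (phi_t.conjugate() * phi_t).real
    assert abs(transformed - mass_term) < 1e-12  # m^2 phi* phi is unchanged

print("U(1) invariance holds")
```

The phase always cancels against its conjugate, so no random choice of $\theta$ can change the mass term.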
Next, consider the term $\partial_i \phi$. The i labels which coordinate we are taking a derivative with respect to (x, y, z, or even t). This term wouldn't appear in a Lagrangian by itself, but it will be a useful example. What happens if we try to do a gauge symmetry on this guy?

$\partial_i (e^{i\theta}\phi) = e^{i\theta}\partial_i \phi + i(\partial_i \theta) e^{i\theta}\phi$
Oh no! It doesn't have a gauge symmetry, because the derivative also acts on $\theta$, producing an extra $i(\partial_i \theta)e^{i\theta}\phi$ piece! We will want derivatives in our Lagrangian, so what can we do? The solution is to add a new field, other than $\phi$. We'll call it $A_i$. We also need a transformation law for $A_i$: when we do a gauge transformation of $\phi$, then

$A_i \to A_i - \partial_i \theta$
So now we could have a term like $[(\partial_i + iA_i)\phi]^* [(\partial_i + iA_i)\phi]$. This will be gauge invariant: the extra piece from the derivative of $\theta$ cancels against the shift in $A_i$.
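We can check numerically both that the plain derivative term fails and that the combination $(\partial_i + iA_i)\phi$ fixes it, using one spatial dimension and arbitrary smooth test functions for the field, the gauge parameter, and the gauge field (all hypothetical choices, just for the demonstration):

```python
import cmath, math

# Arbitrary smooth test functions (hypothetical, for the check only)
phi = lambda x: cmath.exp(1j * x) * (1 + 0.3 * x)   # complex scalar field
theta = lambda x: 0.7 * math.sin(x)                 # gauge parameter
A = lambda x: 0.2 * x                               # gauge field

h = 1e-6
def d(f, x):  # central-difference derivative
    return (f(x + h) - f(x - h)) / (2 * h)

def D(f, a, x):  # covariant derivative: (d/dx + i A) f
    return d(f, x) + 1j * a(x) * f(x)

# Gauge transformation: phi -> e^{i theta} phi,  A -> A - d(theta)/dx
phi_t = lambda x: cmath.exp(1j * theta(x)) * phi(x)
A_t = lambda x: A(x) - d(theta, x)

x0 = 0.9
plain, plain_t = abs(d(phi, x0))**2, abs(d(phi_t, x0))**2
cov, cov_t = abs(D(phi, A, x0))**2, abs(D(phi_t, A_t, x0))**2

print(abs(plain_t - plain) > 1e-3)  # True: |d phi|^2 is NOT gauge invariant
print(abs(cov_t - cov) < 1e-6)      # True: |(d + iA) phi|^2 IS gauge invariant
```

The combination $\partial_i + iA_i$ is what field theorists call the covariant derivative; it transforms with the same overall phase as $\phi$ itself, so its magnitude squared is gauge invariant.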
But notice! In order to "rescue" gauge invariance, we had to add a whole new field; this means we have added a whole new particle type! Further, it's not just a function, it's a vector! That means it describes a spin 1 particle. This is a "gauge boson": a particle that we had to add to satisfy the gauge symmetry.
Does it have mass? Well, can we add a term to the Lagrangian like $m^2 A_i A_i$? The answer is no- it will not satisfy gauge invariance! Try it using the transformation law for A.
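You can see the failure with a short numerical check (arbitrary test functions for the gauge field and gauge parameter, m² = 1; all hypothetical choices): the would-be mass term changes value when $A$ shifts by the derivative of $\theta$.

```python
import math

A = lambda x: 0.2 * x                # arbitrary gauge field
theta = lambda x: 0.7 * math.sin(x)  # arbitrary gauge parameter
h = 1e-6
dtheta = lambda x: (theta(x + h) - theta(x - h)) / (2 * h)

m2, x0 = 1.0, 0.9
before = m2 * A(x0)**2                # candidate mass term m^2 A A
after = m2 * (A(x0) - dtheta(x0))**2  # same term after A -> A - d theta
print(abs(after - before) > 1e-3)  # True: the would-be mass term is not invariant
```

Since the term takes different values in different gauges, it cannot appear in a gauge-invariant Lagrangian, which is exactly the puzzle the Higgs mechanism resolves.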
Hence, to satisfy gauge invariance, we had to postulate spin 1, massless bosons. BUT, we know the W and Z bosons have masses. How can we understand this? We'll have to understand the idea of spontaneous symmetry breaking. This will have to be next post. For now, feel free to ask questions and I'll try to answer.