Monday, September 18, 2017

Faculty position at Rice - theoretical astro-particle/cosmology

Assistant Professor Position at Rice University in

Theoretical Astro-Particle Physics/Cosmology


The Department of Physics and Astronomy at Rice University in Houston, Texas, invites applications for a tenure-track faculty position (Assistant Professor level) in Theoretical Astro-Particle Physics and/or Cosmology. The department seeks an outstanding individual whose research will complement and connect existing activities in the Nuclear/Particle Physics and Astrophysics groups at Rice University (see http://physics.rice.edu). This is the second position in a Cosmic Frontier effort that may eventually grow to three members. The successful applicant will be expected to develop an independent and vigorous research program, and to teach graduate and undergraduate courses. A PhD in Physics, Astrophysics, or a related field is required.

Applicants should send the following: (i) cover letter; (ii) curriculum vitae (including electronic links to 2 relevant publications); (iii) research statement (4 pages or less); (iv) teaching statement (2 pages or less); and (v) the names, professional affiliations, and email addresses of three references.  To apply, please visit: http://jobs.rice.edu/postings/11772.  Applications will be accepted until the position is filled, but only those received by Dec 15, 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Paul Padley (padley@rice.edu).

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.



Faculty position at Rice - experimental condensed matter

Faculty Position in Experimental Condensed Matter Physics Rice University


The Department of Physics and Astronomy at Rice University in Houston, TX, invites applications for a tenure-track faculty position in experimental condensed matter physics.  The department expects to make an appointment at the assistant professor level. This search seeks an outstanding individual whose research interest is in hard condensed matter systems, who will complement and extend existing experimental and theoretical activities in condensed matter physics on semiconductor and nanoscale structures, strongly correlated systems, topological matter, and related quantum materials (see http://physics.rice.edu/). A PhD in physics or a related field is required.

Applicants to this search should submit the following: (1) cover letter; (2) curriculum vitae; (3) research statement; (4) teaching statement; and (5) the names, professional affiliations, and email addresses of three references. For full details and to apply, please visit: http://jobs.rice.edu/postings/11782. Applications will be accepted until the position is filled. The review of applications will begin October 15, 2017, but all those received by December 1, 2017 will be assured full consideration. The appointment is expected to start in July 2018.  Further inquiries should be directed to the chair of the search committee, Prof. Emilia Morosan (emorosan@rice.edu).

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Friday, September 15, 2017

DOE experimental condensed matter physics PI meeting, day 3

And from the last half-day of the meeting:

  • Because the mobile electrons in graphene have an energy-momentum relationship similar to that of relativistic particles, the physics of electrons bound to atomic-scale defects in graphene has much in common with the physics that sets the limits on the stability of heavy atoms - when the kinetic energy of the electrons in the innermost orbitals is high enough that relativistic effects become very important.  It is possible to examine single defect sites with a scanning tunneling microscope and look at the energies of bound states, and see this kind of physics in 2d.  
  • There is a ton of activity concentrating on realizing Majorana fermions, expected to show up in the solid state when topologically interesting "edge states" are coupled to superconducting leads.  One way to do this would be to use the edge states of the quantum Hall effect, but usually the magnetic fields required to get in the quantum Hall regime don't play well with superconductivity.  Graphene can provide a way around this, with amorphous MoRe acting as very efficient superconducting contact material.  The results are some rather spectacular and complex superconducting devices (here and here).
  • With an excellent transmission electron microscope, it's possible to carve out atomically well defined holes in boron nitride monolayers, and then use those to create confined potential wells for carriers in graphene.  Words don't do justice to the fabrication process - it's amazing.  See here and here.
  • It's possible to induce and see big collective motions of a whole array of molecules on a surface that each act like little rotors.
  • In part due to the peculiar band structure of some topologically interesting materials, they can have truly remarkable nonlinear optical properties.
My apologies for not including everything - side discussions made it tough to take comprehensive notes, and the selection in these postings reflects that rather than any judgment of relative excitement.  Likewise, the posters at the meeting were very informative, but I did not take notes on those.

Wednesday, September 13, 2017

DOE experimental condensed matter PI meeting, day 2

More things I learned:

  • I've talked about skyrmions before.  It turns out that by coupling a ferromagnet to a strong spin-orbit coupling metal, one can stabilize skyrmions at room temperature.  They can be visualized using magnetic transmission x-ray microscopy - focused, circularly polarized x-ray studies.   The skyrmion motion can show its own form of the Hall effect.  Moreover, it is possible to create structures where skyrmions can be created one at a time on demand, and moved back and forth in a strip of that material - analogous to a racetrack memory.
  • Patterned arrays of little magnetic islands continue to be a playground for looking at analogs of complicated magnetic systems.  They're a kind of magnetic metamaterial.  See here.  It's possible to build in frustration, and to look at how topologically protected magnetic excitations (rather like skyrmions) stick around and can't relax.
  • Topological insulator materials, with their large spin-orbit effects and surface spin-momentum locking, can be used to pump spin and flip magnets.  However, the electronic structure of both the magnet and the TI are changed when one is deposited on the other, due in part to interfacial charge transfer.
  • There continues to be remarkable progress on the growth and understanding of complex oxide heterostructures and interfaces - too many examples and things to describe.
  • The use of nonlinear optics to reveal complicated internal symmetries (talked about here) continues to be very cool.
  • Antiferromagnetic layers can be surprisingly good at passing spin currents.  Also, I want to start working on yttrium iron garnet, so that I can use this at some point in a talk.
  • It's possible to do some impressive manipulation of the valley degree of freedom in 2d transition metal dichalcogenides, creating blobs of complete valley polarization, for example.  It's possible to use an electric field to break inversion symmetry in bilayers and turn some of these effects on and off electrically.
  • The halide perovskites actually can make fantastic nanocrystals in terms of optical properties and homogeneity.

Tuesday, September 12, 2017

DOE experimental condensed matter PI meeting, day 1

I'm pressed for time, so this is brief, but here are some things I learned yesterday:
  • An electric field perpendicular to the plane can split and shift the Landau levels of bilayer graphene.  See here.
  • The quantum Hall effect in graphene and other 2d systems still has a lot of richness and life in it.
  • I have one word for you.... "polaritons".
  • It's possible to set up a tunneling experiment, from one "probe" 2d electron gas that has a small, tight Fermi surface, into a "sample" 2d electron gas of interest.  By playing with the in-plane magnetic field, the tunneling electrons can pick up momentum in the plane as they tunnel.  The result is, the tunneling current as a function of voltage and transverse fields lets you map out exactly the "sample" electronic states as a function of energy and momentum, like ARPES without the PES part.  See here.
  • Squeezing mechanically to apply pressure can actually produce dramatic changes (quantum phase transitions) in unusual fractional quantum Hall states.
  • How superconductivity dies in the presence of disorder, magnetic field, and temperature remains very rich and interesting.  The "Bose metal", in which a magnetic field kills global phase coherence without completely ripping apart Cooper pairs, can be an important part of that transition.  For related work, see here.
  • One should be very careful in interpreting ARPES data.  It's entirely possible that not everything identified as some exotic topological material really fits the bill - see here.  On the other hand, sometimes you do see real topologically interesting band structure.
  • The DOE still has laptops running Windows XP.

Sunday, September 10, 2017

DOE Experimental Condensed Matter PI meeting, 2017

The Basic Energy Sciences program is part of the US Department of Energy's Office of Science, and they are responsible for a lot of excellent science research funding.  The various research areas within BES have investigator meetings every two years, and at the beginning of this coming week is the 2017 PI meeting for the experimental condensed matter physics program.  As I've done in past years,  I will try to write up a bulleted list of things I learn.   (See here, here, and here for the 2013 meeting; see here, here, here, and here for the 2015 meeting).

Good luck and stay safe to those in Florida about to get hit by Hurricane Irma.  It's very different than Harvey (much more of a concern about wind damage and storm surge, much less about total rainfall), but still very dangerous.

Lastly, Amazon seems to have my book available for a surprisingly low price right now ($62, though the list is $85).  I (and my publisher) still have no idea how they can do this without losing money.  

Sunday, September 03, 2017

Capillary action - the hidden foe in the physics of floods

There is an enormous amount of physics involved in storms and floods.   The underlying, emergent properties of water are key to much of this.

An individual water molecule can move around, and it can vibrate and rotate in various ways, but it's not inherently wet.  Only when zillions of water molecules get together does something like "wetness" of water even take on meaning.  The zillions of molecules are very egalitarian:  They explore all possible microscopic arrangements (including how they're distributed in space and how they're moving) that are compatible with their circumstances (e.g., sitting at a particular temperature and pressure).  Sometimes the most arrangements correspond to the water molecules being close together as a liquid - the water molecules are weakly attracted to each other if they get close together; at other temperatures and pressures, the most arrangements correspond to the water molecules being spread out as a gas.  Big tropical systems are basically heat engines, powered by the temperature difference between the surface layers of seawater and the upper atmosphere.  That difference in temperatures leads to net evaporation, driving water into the gas phase (by the gigaton, in the case of Hurricane Harvey).  Up in the cold atmosphere, the water condenses again into droplets, heating the air.  If those droplets are small enough, the drag forces from adjacent air molecules bouncing off the droplets slow the droplets to the point where they are borne aloft by large-scale breezes - that's why clouds don't fall down even though they're made of water droplets.
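
To put a rough number on that last point, here is a minimal sketch using Stokes drag for the settling speed of a small droplet; the droplet radii are just representative choices:

```python
# Terminal (settling) speed of a small water droplet in air via Stokes drag:
# v = 2 r^2 (rho_water - rho_air) g / (9 eta).  Illustrative numbers only.
g = 9.8            # m/s^2
eta_air = 1.8e-5   # Pa*s, viscosity of air near room temperature
rho_w = 1000.0     # kg/m^3, liquid water
rho_air = 1.2      # kg/m^3

for r in (1e-6, 10e-6, 100e-6):   # droplet radii in meters
    v = 2 * r**2 * (rho_w - rho_air) * g / (9 * eta_air)
    print(f"r = {r*1e6:5.0f} um  ->  v = {v*100:.3g} cm/s")
# ~0.01 cm/s for 1 um droplets and ~1 cm/s for 10 um droplets - easily
# outdone by gentle updrafts, which is why clouds stay aloft.  (A 100 um
# drizzle drop falls at ~1 m/s, where Stokes' law is already marginal.)
```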

There is another feature that comes from the attraction of water molecules for each other, and the attraction between water molecules and their surroundings.   Because of the intermolecular attraction, water molecules would have less energy if they were close together, and therefore having a water-air interface costs energy.  One result is surface tension - the tendency for liquid droplets to pull into small blobs that minimize their (liquid/vapor interface) surface area.

However, sometimes the attractive interaction between a water molecule and some surface can be even stronger than the interaction between the water molecule and other water molecules.  When that happens, a water droplet on such a surface will spread out instead of "beading up".  The surface is said to be hydrophilic.  See here.  This is why some surfaces "like" to get wet, like your dirty car windshield.

Sneaking in here is actually the hidden foe that is known all too well to those who have ever dealt with flooding.  You've seen it daily, even if you've never consciously thought about it.  It's capillary action.  A network of skinny pores or very high surface area hydrophilic material can wick up water like crazy.  Again, the water is just exploring all possible microscopic arrangements, and it so happens that in a high surface area, hydrophilic environment, many many arrangements involve the water being spread out as much as possible on that surface.  This can be to our advantage sometimes - it helps get water to the top of trees, and it makes paper towels work well for drying hands.  However, it can also cause even a couple of cm of floodwater indoors to ruin the bottom meter of sheetrock, or bring water up through several cm of insulation into wood floors, or transport water meters up carpeted stairs.   Perhaps it will one day be economically and environmentally feasible to make superhydrophobic wall and flooring material, but we're not there yet.
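
For a sense of scale, here's a minimal estimate of capillary rise using Jurin's law; the pore radii are illustrative guesses on my part, not measured values for sheetrock:

```python
from math import cos, radians

# Jurin's law for capillary rise in a pore of radius r:
# h = 2 gamma cos(theta) / (rho g r)
gamma = 0.072   # N/m, surface tension of water
rho = 1000.0    # kg/m^3, density of water
g = 9.8         # m/s^2
theta = 0.0     # contact angle in degrees; 0 = perfectly hydrophilic

for r in (100e-6, 10e-6, 1e-6):   # pore radii in meters (illustrative)
    h = 2 * gamma * cos(radians(theta)) / (rho * g * r)
    print(f"r = {r*1e6:6.1f} um  ->  h = {h:.2f} m")
# ~0.15 m for 100 um pores, ~1.5 m for 10 um pores: pores in gypsum and
# paper facing are easily small enough to ruin the bottom meter of wall.
```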

(To all my Houston readers, I hope you came through the storm ok!  My garage had 0.8m of water, which killed my cars, but the house is otherwise fine, and the university + lab did very well.)

Friday, August 25, 2017

Hurricanes, heat engines, etc.

Looks like it's going to be a wet few days, with the arrival of Harvey.   I've mentioned previously that hurricanes and tropical storm systems are heat engines - they basically use the temperature difference between the heated water in the ocean and the cooler air in the upper atmosphere to drive enormous flows of matter (air currents, water in vapor and liquid form).  A great explanation of how this works is here.  Even with very crude calculations, one can see that the power involved in a relatively small tropical rain event is thousands of GW, hundreds of times greater than the power demands of a major city.   Scaling up to a hurricane, you arrive at truly astonishing numbers.  It's likely that Harvey is churning along at an average power some 200 times greater than the electrical generating capacity of the planet (!).  Conservative predictions right now are for total rainfall of maybe 40 cm across an area the size of the state of Louisiana, which would be a total of 5.2e10 metric tons of water.   Amazing.  I'm planning to write more in the future about some of this, time permitting.
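
Here's a minimal sketch of that rainfall arithmetic; the area, storm duration, and latent heat figures are my own round numbers:

```python
# Rough rainfall mass estimate for Harvey (illustrative round numbers):
# ~40 cm of rain over an area about the size of Louisiana.
area = 1.3e5 * 1e6      # ~130,000 km^2, converted to m^2
depth = 0.40            # m of rainfall
rho = 1000.0            # kg/m^3, density of water

mass_kg = area * depth * rho
print(f"{mass_kg:.2g} kg = {mass_kg/1000:.2g} metric tons")   # ~5.2e10 tons

# Latent heat released when that much vapor condenses, spread over ~4 days:
L_v = 2.26e6            # J/kg, latent heat of vaporization of water
power = mass_kg * L_v / (4 * 24 * 3600)
print(f"average power ~ {power:.2g} W")   # a few hundred TW from rain alone
```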

Update:  For what it's worth, Vox has an article about Harvey, and they say it deposited 14-15 trillion gallons of water.  Each gallon of water has a mass of about 3.78 kg, so 14 trillion gallons comes to 5.3e10 metric tons.  How's that for estimating accuracy in the above?


Friday, August 18, 2017

Invited symposia/speaker deadlines, APS 2018

For those readers who are APS members, a reminder.  The deadline for nominations for invited symposia for the Division of Condensed Matter Physics for the 2018 March Meeting is August 31.   See here.

Likewise, the Division of Materials Physics has their invited speaker (w/in focus topic) deadline of August 29.  See here.

Please nominate!

Power output of a lightsaber

It's been a long week and lots of seriously bad things have been in the news.   I intend to briefly distract myself by looking at that long-standing question, what is the power output of a lightsaber?  I'm talking about the power output when the lightsaber is actually slicing through something, not just looking cool.  We can get a very rough, conservative estimate from the documentary video evidence before us.  I choose not to use the prequels, on general principle, though the scene in The Phantom Menace when Qui-Gon Jinn cuts through the blast door would be a good place to start.  Instead, let's look at The Force Awakens, where Kylo Ren throws a tantrum and slices up an instrument panel, leaving behind dripping molten metal.

With each low-effort swing of his arm, Ren's lightsaber, with a diameter of around 2 cm, moves at something more than 2 m/s, slicing metal to a depth of, say, 3 cm (actually probably deeper than that).  That is, the cutting part of the blade is sweeping out a volume of around 1200 cc/sec.  It is heating that volume of console material up to well above its melting point, so we need to worry about the energy it takes to heat the solid from room temperature (300 K) up to its melting point, and then the heat of fusion required to melt the material.  At a rough guess, suppose Imperial construction is aluminum.  Aluminum has a specific heat of 0.9 J/g-K, a density of 2.7 g/cc when solid, a melting point of 933 K, and a heat of fusion of 10.7 kJ/mol.  In terms of volume, that's (10.7 kJ/mol)(1 mol/27 g)(2.7 g/cc) = 1070 J/cc.  So, the total power is around (933 K - 300 K)(0.9 J/g-K)(2.7 g/cc)(1200 cc/sec) + (1070 J/cc)(1200 cc/sec) = 3.1e6 J/s = 3.1 MW.
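
For anyone who wants to check the arithmetic, here's the same estimate as a short script (all the inputs are the guesses quoted above):

```python
# Back-of-the-envelope lightsaber power, assuming the panel is aluminum.
blade_d = 0.02      # m, blade diameter
speed = 2.0         # m/s, swing speed
depth = 0.03        # m, cut depth
vol_rate = blade_d * speed * depth * 1e6   # cc/s -> 1200 cc/s

c_p = 0.9           # J/(g*K), specific heat of Al
rho = 2.7           # g/cc, density of solid Al
dT = 933 - 300      # K, room temperature up to the melting point
h_fus = 10.7e3 / 27 * rho   # J/cc: (10.7 kJ/mol)(mol/27 g)(2.7 g/cc) ~ 1070

power = (dT * c_p * rho + h_fus) * vol_rate
print(f"{power/1e6:.1f} MW")   # ~3.1 MW
```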

Hot stuff.

Thursday, August 10, 2017

That's the way the ball bounces.

How does a ball bounce?  Why does a ball, dropped from some height onto a flat surface, not bounce all the way back up to its starting height?  The answers to these questions may seem obvious, but earlier this week, this paper appeared on the arxiv, and it does a great job of showing what we still don't understand about this everyday physics that is directly relevant for a huge number of sports.

The paper talks specifically about hollow or inflated balls.  When a ball is instantaneously at rest, mid-bounce, its shape has been deformed by its interaction with the flat surface.  The kinetic energy of its motion has been converted into potential energy, tied up in a combination of the elastic deformation of the skin or shell of the ball and the compression of the gas inside the ball.  (One surprising thing I learned from that paper is that high speed photography shows that the non-impacting parts of such inflated balls tend to remain spherical, even as part of the ball deforms flat against the surface.)  That gas compression is quick enough that heat transfer between the gas and the ball is probably negligible.  A real ball does not bounce back to its full height; equivalently, the ratio \(v_{f}/v_{i}\) of the ball's speed immediately after the bounce, \(v_{f}\), to that immediately before the bounce, \(v_{i}\), is less than one.  That ratio is called the coefficient of restitution.
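
One immediate consequence: since the drop and rebound are free-fall, with \(v = \sqrt{2gh}\), the rebound height scales like the square of the coefficient of restitution.  A quick sketch (the value of \(e\) here is just a representative guess for a basketball):

```python
# Rebound height from the coefficient of restitution e = v_f / v_i.
# Since v = sqrt(2 g h) in free fall, h_f / h_i = e**2.
e = 0.75        # representative guess for a well-inflated basketball
h_i = 1.0       # m, drop height
h_f = e**2 * h_i
print(f"rebound: {h_f:.2f} m ({100*e**2:.0f}% of the drop height)")
```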

Somehow, in the bounce process, some energy must have been lost from the macroscopic motion of the ball, and since we know energy is conserved, that energy must eventually show up as the disorganized, microscopic energy of jiggling atoms that we colloquially call heat.   How can this happen?

  • The skin of the ball might not be perfectly elastic - there could be some "viscous losses" or "internal friction" as the skin deforms.
  • As the ball impacts the surface, it can launch sound waves into the surface that eventually dissipate.
  • Similarly, the skin of the ball itself can start vibrating in a complicated way, eventually damping out to disorganized jiggling of the atoms.
  • As the ball's skin hits the ground and deforms, it squeezes air out from beneath the ball; the speed of that air can actually exceed the speed of sound in the surrounding medium (!), creating a shock wave that dissipates by heating the air, as well as ordinary sound vibrations.  (It turns out that clapping your hands can also create shock waves!  See here and here.)
  • There can also be irreversible acoustic processes in the gas inside the ball that heat the gas in there.
This paper goes through all of these, estimates how big those effects are, and concludes that, for many common balls (e.g., basketballs), we actually don't understand the relative importance of these different contributions.  The authors propose some experiments to figure out what's going on.  The whole thing is a nice exercise in mechanics and elasticity, and it's always fun to realize that there may still be some surprises lurking in the physics of the everyday.

Saturday, August 05, 2017

Highlights from Telluride

Here are a few highlights from the workshop I mentioned.  I'll amend this over the next couple of days as I have time.  There is no question that smaller meetings (this one was about 28 people) can be very good for discussions.
  • I learned that there is a new edition of Cuevas and Scheer that I should pick up.  (The authors are Juan Carlos Cuevas and Elke Scheer, a great theorist/experimentalist team-up.)
  • Apparently it's possible to make a guitar amplifier using tunnel junctions made from self-assembled monolayers.  For more detail, see here.
  • Some folks at Aachen have gotten serious about physics lab experiments you can do with your mobile phone.
  • Richard Berndt gave a very nice talk about light emission from atomic-scale junctions made with a scanning tunneling microscope.  Some of that work has been written about here and here.  A key question is, when a bias of \(eV\) is applied to such a junction, what is the mechanism that leads to the emission of photons of energies \(\hbar \omega > eV\)?  Clearly the processes involve multiple electrons, but exactly how things work is quite complicated, involving both the plasmonic/optical resonances of the junction and the scattering of electrons at the atomic-scale region.  Two relevant theory papers are here and here.
  • Latha Venkataraman showed some intriguing new results indicating room temperature Coulomb blockade-like transport in nanoclusters.  (It's not strictly Coulomb blockade, since the dominant energy scale seems to be set by single-particle level spacing rather than by the electrostatic charging energy of changing the electronic population by one electron).
  • Katharina Franke showed some very pretty data on single porphyrins measured via scanning tunneling microscope, as in here.  Interactions between the tip and the top of the molecule result in mechanical deformation of the molecule, which in turn tunes the electronic coupling between the transition metal in the middle of the porphyrin and the substrate.  This ends up being a nice system for tunable studies of Kondo physics.
  • Uri Peskin explained some interesting recent results that were just the beginning of some discussions about what kind of photoelectric responses one can see in very small junctions.  One recurring challenge:  multiple mechanisms that seem to be rather different physics can lead to similar experimentally measurable outcomes (currents, voltages).
  • Jascha Repp discussed some really interesting experiments combining STM and THz optics, to do true time-resolved measurements in the STM, such as watching a molecule bounce up and down on a metal surface (!).  This result is timely (no pun intended), as this remarkable paper just appeared on the arxiv, looking at on-chip ways of doing THz and faster electronics.
  • Jeff Neaton spoke about the ongoing challenge of using techniques like density functional theory to calculate and predict the energy level alignment between molecules and surfaces to which they're adsorbed or bonded.  This is important for transport, but also for catalysis and surface chemistry broadly.  A relevant recent result is here.
  • Jan van Ruitenbeek talked about their latest approach to measuring shot noise spectra in atomically small structures up to a few MHz, and some interesting things that this technique has revealed to them at high bias.  
  • There were multiple theory talks looking at trying to understand transport, inelastic processes, and dissipation in open, driven quantum systems.  Examples include situations where higher driving biases can actually make cooling processes more efficient; whether it's possible to have experiments in condensed matter systems that "see" many-body localization, an effect most explored in cold atom systems; using ballistic effects in graphene to do unusual imaging experiments or make electronic "beam splitters"; open systems from a quantum information point of view; what we mean by local effective temperature on very small scales; and new techniques for transport calculations. 
  • Pramod Reddy gave a really nice presentation about his group's extremely impressive work measuring thermal conduction at the atomic scale.  Directly related, he also talked about the challenges of measuring radiative heat transfer down to nm separations, where the Stefan-Boltzmann approach should be supplanted by near-field physics.  This was a very convincing lesson in how difficult it is to ensure that surfaces are truly clean, even in ultrahigh vacuum.
  • Joe Subotnik's talk about electronic friction was particularly striking to me, as I'd been previously unaware of some of the critical experiments (1, 2).  The central questions: when and how do electron-hole excitations in metals lead to big changes in the vibrational energy content of molecules, and how should we think about this?  These issues are related to these experiments as well.
  • Ron Naaman spoke about chiral molecules and how electron transfer to and from these objects can have surprising, big effects (see here and here).
  • Gemma Solomon closed out the proceedings with a very interesting talk about whether molecules could be used to make effective insulating layers better at resisting tunneling current than actual vacuum, and a great summary of the whole research area, where it's been, and where it's going.

Thursday, August 03, 2017

Workshop on quantum transport

Blogging has been slow b/c of travel.  I'm attending a workshop on "Quantum transport in nanoscale molecular systems".  This is rather like a Gordon Conference, with a fair bit of unpublished work being presented, but when it's over I'll hit a few highlights that are already in the literature.  Update:  here you go.

Sunday, July 23, 2017

Several items - the arxiv, "axial-gravitational" fun, topology

Things have been a bit busy, but here are a few items that have popped up recently:
  • Symmetry magazine is generally insightful and well-written.   Recently they posted this amusing article looking at various fun papers on the arxiv.  Their first example reminds me of this classic.
  • Speaking of the arxiv, its creator, Paul Ginsparg, posted this engaging overview recently.  It's not an overstatement to say that the arxiv has had an enormous impact on science over the last 25 years.
  • There has been a huge amount of media attention on this paper (arxiv version).  The short version:  In high energy physics there is a conservation principle regarding chiral (meaning that the particle spin is directed along its momentum) massless fermions: ordinarily, such particles are produced with no net excess of one handedness of spin over the other.  There is a long-standing high energy theory argument that in curved spacetime the situation changes, and you can get an excess of one handedness - a "chiral anomaly".  It is difficult to see how one could test this directly via experiment, since in our daily existence spacetime curvature is pretty minimal, unlike, say, near the event horizon of a small black hole.  However, solid state materials can provide a playground for some wild ideas.  The spatial arrangement of atoms in a crystalline solid strongly affects the dispersion relation, the relationship between energy and (the crystal analog of) momentum.  For example, the linear dispersion relation between energy and momentum in (neutral) graphene makes the electrons behave in some ways analogous to massless relativistic particles, and lets people do experiments that test the math behind things like Klein tunneling.  As a bonus, you can add in spin-orbit coupling in solids to bring spin into the picture.  In this particular example, the electronic structure of NbP is such that, once one accounts for the spatial symmetries and spin-orbit effects, and if the number of electrons in there is right, the low-energy electronic excitations are supposed to act mathematically like massless chiral fermions (Weyl fermions).  Moreover, in a temperature gradient, the math looks like that used to describe the gravitational anomaly I mentioned above, and this is a system where one can actually do measurements.  However, there is a lot of hype about this, so it's worth stating clearly:  gravity itself does not play a role in NbP or this experiment.  Also, I have heard concerns about the strength of the experimental interpretation, because of issues about anisotropy in the NbP material and the aspect ratio of the sample.
  • Similarly, there is going to be a lot of media attention around this paper, where researchers have combined a material ((Cr0.12Bi0.26Sb0.62)2Te3) that acts like a kind of topological insulator (a quantum anomalous Hall insulator, to use the authors' particular language) and a superconductor (Nb).  The result is predicted to be a system with conduction around the edges, in which the low energy current-carrying excitations act like Majorana fermions, another concept originally invented in the context of high energy physics.
  • Both of these are examples of a kind of topology mania going on in condensed matter physics these days, as described here.  This deserves a longer discussion later.  

Sunday, July 16, 2017

A thermoelectric surprise in metals

Earlier this year I described what thermoelectricity is, and I also discussed recent work of ours in which we used a laser as a scannable heat source, and showed that changing the size of a nanoscale metal structure can vary the material's thermoelectric properties - enough to make a thermocouple out of a single metal.

With this same measurement technique, we found a result that we thought was rather strange and surprising, which we have written up here.   Take a moderately long wire, say 120 nm wide and several microns long, made by patterning a 15 nm thick Au film.  Hook up what is basically a voltmeter to the ends of the wire, and scan the laser spot along the length of the wire, recording the voltage as a function of the laser position.  If the wire is nice and homogeneous, you'd expect not to see too much until you get to the ends of the wire, where it widens out into bigger contacts.  (There the size variation should make the skinny/wide junction act like a thermocouple.)   Instead, we see the result shown here in the figure (fig. 2 of the paper).  There is a great deal of spatial variability in the photothermoelectric voltage, as if the wire were actually made up of a whole bunch of little thermocouples!

Note that your eye tends to pick out a spatial scale in panel (a) comparable to the 1 micron scale bar.  That's a bit misleading; the spot size of the laser in our system is about 1.8 microns, so this measurement approach would not pick up much smaller spatial scales of variation.

The metal wire is polycrystalline, and if you look at the electron microscope images in panels (c, d, e) you can make out a grain structure with lateral grain sizes of 15-20 nm.  Maybe the wire isn't all that homogeneous?  One standard way physicists look at the quality of metal films is to consider the electrical resistance of a square patch of film (\(R_{\square}\), the "sheet resistance" or "resistance per square"), and compare that number with the "resistance quantum", \(R_{\mathrm{q}}\equiv h/2e^2\), a combination of fundamental constants that sets a scale for resistance.  If you had two pieces of metal touching at a single atom, the resistance between them would be around the resistance quantum.  For our wire material, \(R_{\square}\) is a little under 4 \(\Omega\), so \(R_{\square} \ll R_{\mathrm{q}}\), implying that the grains of our material are very well-connected - that it should act like a pretty homogeneous film.  This is why the variation shown in the figure is surprising.  Annealing the wires does change the voltage pattern as well as smoothing it out.  This is a pretty good indicator that the grain boundaries really are important here.  We hope to understand this better - it's always fun when a system thought to be well understood surprises you.
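
For reference, here's the arithmetic behind that comparison (the 4 \(\Omega\) figure is the measured sheet resistance quoted above):

```python
# Comparing the film's sheet resistance to the resistance quantum.
h = 6.626e-34       # J*s, Planck's constant
e = 1.602e-19       # C, electron charge

R_q = h / (2 * e**2)    # "resistance quantum", ~12.9 kOhm
R_sq = 4.0              # Ohms per square, measured for our Au film
print(f"R_q = {R_q:.3g} Ohm, R_sq/R_q = {R_sq/R_q:.1e}")
# R_sq is ~3e-4 of R_q: the grains are very strongly coupled, so naively
# the film "should" look electrically homogeneous.
```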

Friday, July 07, 2017

Two books that look fun

Two books that look right up my alley:

  • Storm in a Teacup by Helen Czerski.  Dr. Czerski is a researcher at University College London, putting her physics credentials to work studying bubbles in physical oceanography.  She also writes the occasional "everyday physics" column in the Wall Street Journal, and it's great stuff.
  • Max the Demon vs. Entropy of Doom by Assa Auerbach and Richard Codor.   Prof. Auerbach is a serious condensed matter theorist at the Technion.  This one is a Kickstarter project to produce a light-hearted graphic novel that is educational without being overly mathematical.  Looks fun.  Seems like the target audience would be similar to that for Spectra.

Thursday, July 06, 2017

Science and policy-making in the US

Over twenty years ago, Congress de-funded its Office of Technology Assessment, which was meant to be a non-partisan group (somewhat analogous to the Congressional Budget Office) that was to help inform congressional decision-making on matters related to technology and public policy.  The argument at the time of the de-funding was that it was duplicative - that there are other federal agencies (e.g., DOE, NSF, NIH, EPA, NOAA) and bodies (the National Academies) that are capable of providing information and guidance to Congress.   In addition, there are think-tanks like the Rand Corporation, IDA, and MITRE, though those groups need direction and a "customer" for their studies.   Throughout this period, the executive branch at least had the Office of Science and Technology Policy, headed by the Presidential Science Advisor, to help in formulating policy.  The level of influence of OSTP and the science advisor waxed and waned depending on the administration.   Science is certainly not the only component of technology-related policy, nor even the dominant one, but for the last forty years (OSTP's existence) and arguably going back to Vannevar Bush, there has been broad bipartisan agreement that science should at least factor into relevant decisions.

We are now in a new "waning" limit, where all of the key staff offices at OSTP are vacant, and there seems to be no plan or timeline to fill them.  The argument from the administration, articulated here, is that OSTP was redundant and that its existence is not required for science to have a voice in policy-making within the executive branch.  While that is technically true, in the sense that the White House can always call up anyone they want and ask for advice, removing science's official seat at the table feels like a big step.  As I've mentioned before, some things are hard to un-do.  Wiping out OSTP for at least the next 3.5 years would send a strong message, as does gutting the science boards of agencies.  There will be long-term effects, both in actual policy-making and in the continuity of knowledge and the pipeline of scientists and engineers interested in and willing to devote time to this kind of public service.  (Note that there is a claim from an unnamed source that there will be a new OSTP director, though there is no timeline.)

Thursday, June 29, 2017

Condensed matter/nano resources for science writers and journalists

I've been thinking about and planning to put together some resources about condensed matter physics and nanoscience that would be helpful for science writers and journalists.  Part of the motivation here is rather similar to that of doing outreach work with teachers - you can get a multiplicative effect compared to working with individual students, since each teacher interacts with many students.  Along those lines, helping science writers, journalists, and editors might have an impact on a greater pool than just those who directly read my own (by necessity, limited) writing.  I've had good exchanges of emails with some practitioners about this, and that has been very helpful, but I'd like more input from my readers.

In answer to a few points that have come up in my email discussions:

  • Why do this?  Because I'd like to see improved writing out there.  I'd like the science-interested public to understand that there is amazing, often deep physics around them all the time - that there are deep ideas at work in your iPhone or your morning cup of coffee, and that those are physics, too.  I know that high energy ("Building blocks of the universe!") and astro ("Origins of everything!  Alien worlds!  Black holes!") are very marketable.  I'd be happy to guide a little more of the bandwidth toward condensed matter/materials/real nano (not sci-fi) popularization.  I think the perception that high energy = all of physics goes a long way toward explaining why so many people (including politicians) think that basic research is pie-in-the-sky-useless, and everything else is engineering that should be funded by companies.  I do think online magazines like Quanta and sites like Inside Science are great and headed in a direction I like.  I wish IFLS was more careful, but I admire their reach.
  • What is the long-range audience and who are the stakeholders?  I'd like CMP and nano to reach a broad audience.  There are serious technically trained people (faculty, researchers, some policy makers) who already know a lot of what I'd write about, though some of them still enjoy reading prose that is well written.  I am more thinking about the educated lay-public - the people who watch Nova or Scientific American Frontiers or Mythbusters or Through The Wormhole (bleah) or Cosmos, or who read Popular Science or Discovery or Scientific American or National Geographic.  Those are people who want to know more about science, or at least aren't opposed to the idea.  I guess the stakeholders would be the part of the physics  and engineering community that work on solid state and nano things, but don't have the time or inclination to do serious popular communication themselves.  I think that community is often disserved by (1) the popular portrayal that high energy = all of physics and crazy speculative stuff = actual tested science; (2) hype-saturated press releases that claim breakthroughs or feel the need to promise "1000x faster computers" when real, fundamental results are often downplayed; and (3) a focus in the field that only looks at applications rather than properly explaining the context of basic research.
  • You know that journalists usually have to cover many topics and have very little time, right?  Yes.  I also know that just because I make something doesn't mean anyone will necessarily use it - hence my request for input here.  Maybe something like a CM/nano FAQ would be helpful.
  • You know that long-form non-fiction writers love to do their own topical research, right?  Yes, and if there was something I could do to help those folks save time and avoid subject matter pitfalls, I'd feel like I'd accomplished something.
  • You could do more writing yourself, or give regular tips/summaries to journalists and editors via twitter, your blog, etc.  That's true, and I plan to try to do more, but as I said at the top, the point is not for me to become a professional journalist (in the sense of providing breaking news tidbits) or writer, but to do what I can to help those people who have already chosen that vocation. 
  • You know there are already pros who worry about quality of science writing and journalism, right?  Yes, and they have some nice reading material.  For example, this and this from the Berkeley Science Review; this from the Guardian; this from the National Association of Science Writers.
So, writers and editors who might read this:  What would actually be helpful to you along these lines, if anything?  Some primer material on selected topics, more accessible and concise than Wikipedia?


Tuesday, June 20, 2017

About grants: What are "indirect costs"?

Before blogging further about science, I wanted to explain something about the way research grants work in the US.  Consider this part of my series of posts intended to educate students (and perhaps the public) about careers in academic research.

When you write a proposal to a would-be source of research funding, you have to include a budget.  As anyone would expect, that budget will list direct costs - these are items that are clear research expenses.  Examples would include, say, $30K/yr for a graduate student's stipend, and $7K for a piece of laboratory electronics essential to the work, and $2K/yr to support travel of the student and the principal investigator (PI) to conferences.   However, budgets also include indirect costs, sometimes called overhead.  The idea is that research involves certain costs that aren't easy to account for directly, like the electricity to run the lights and air conditioning in the lab, or the costs to keep the laboratory building maintained so that the research can get done, or the (meta)costs for the university to administer the grant.  

So, how does a university figure out how much to tack on for indirect costs?  For US federal grants, the magic (ahem) is all hidden away in OMB Circular A21 (wiki about it, pdf of the actual doc).  Universities periodically go through an elaborate negotiation process with the federal government (see here for a description of this regarding MIT), and determine an indirect cost rate for that university.  The idea is that you take a version of the direct costs ("modified total direct costs" - for example, a piece of equipment that costs more than $5K is considered a capital expense and is not subject to indirect costs) and multiply by a negotiated factor (in the case of Rice right now, 56.5%) to arrive at the indirect costs.  The cost rates are lower for research done off campus (like at CERN), with the argument that such work should be cheaper for the university.  (Effective indirect cost rates at US national labs tend to be much higher.)
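
As a concrete toy example, using the sample budget numbers from the previous section and Rice's 56.5% rate (the single-year framing and the short list of exclusions are simplifications on my part; real MTDC rules have more exclusions, such as tuition and portions of subawards):

```python
# Toy indirect-cost calculation for one year of the example budget above.
rate = 0.565                      # negotiated indirect cost rate

direct = {"grad student stipend": 30_000,
          "lab electronics":       7_000,   # capital equipment, > $5K
          "travel":                2_000}

# Equipment over $5K is excluded from modified total direct costs (MTDC).
mtdc = sum(v for k, v in direct.items() if k != "lab electronics")
indirect = rate * mtdc
total = sum(direct.values()) + indirect
print(f"direct ${sum(direct.values()):,}, indirect ${indirect:,.0f}, "
      f"total request ${total:,.0f}")
# direct $39,000, indirect $18,080, total request $57,080
```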

Foundations and industry negotiate different rates with universities.  Foundations usually limit their indirect cost payments, arguing that they just can't afford to pay at the federal level.  The Bill and Melinda Gates Foundation, for example, only allows (pdf) 10% for indirect costs.   The effective indirect rate for a university, averaged over the whole research portfolio, is always quite a bit lower than the nominal A21 negotiated rate.  Vice provosts/presidents/chancellors for research at major US universities would be happy to explain at length that indirect cost recovery doesn't come close to covering the actual costs associated with doing university-based research.  

Indirect cost rates in the US are fraught with controversy, particularly now.  The current system is definitely complicated, and reasonable people can ask whether it makes sense (and adds administrative costs) to have every university negotiate its own rate with the feds.   It remains to be seen whether there are changes in the offing.

Saturday, June 17, 2017

Interesting reading material

Summer travel and other activities have slowed blogging, but I'll pick back up again soon.  In the meantime, here are a couple of interesting things to read:

  • Ignition!  An Informal History of Liquid Rocket Propellants (pdf) is fascinating, if rather chemistry-heavy.  Come for discussions of subtle side reactions involved in red fuming nitric acid slowly eating its storage containers and suggested (then rejected) propellants like dimethyl mercury (!!), and stay for writing like, "Miraculously, nobody was killed, but there was one casualty — the man who had been steadying the cylinder when it split. He was found some five hundred feet away, where he had reached Mach 2 and was still picking up speed when he was stopped by a heart attack."  This is basically the story from start to finish (in practical terms) of the development of liquid propellants for rockets.   That book also led me to stumbling onto this library of works, most of which are waaaaay too chemistry-oriented for me.  Update:  for a directly relevant short story, see here.
  • Optogenetics is the idea of using light to control and trigger the activation/inactivation of genes.  More recently, there has been a big upswing in the idea of magnetogenetics, using magnetic fields to do similar things.  One question at play is, what physical mechanism lets magnetic fields do much of anything at room temperature, since magnetic effects tend to be weak.  (Crudely speaking, the energy scale of visible photons is ~eV, much larger than the thermal energy scale of \(k_{\mathrm{B}}T \sim\) 26 meV, and readily able to excite vibrations or drive electronic excitations.  However, the energy of one electron spin in a reasonably accessible magnetic field of 1 Tesla is only \(g \mu_{\mathrm{B}}B \sim\) 0.1 meV.  A quick numerical comparison appears after this list.)  Here is a nice survey article about the constraints on how magnetogenetics could operate.
  • For a tutorial in how not to handle academic promotion cases, see here.  
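
Here is the quick numerical comparison of those energy scales promised above; the 2 eV visible photon is a representative choice:

```python
# Comparing the energy scales mentioned above (all in eV).
k_B = 8.617e-5       # eV/K, Boltzmann constant
mu_B = 5.788e-5      # eV/T, Bohr magneton
g = 2.0              # electron g-factor

E_photon = 2.0                   # visible photon, roughly
E_thermal = k_B * 300            # k_B T at room temperature
E_zeeman = g * mu_B * 1.0        # one electron spin in 1 Tesla

print(f"photon  ~ {E_photon:.3g} eV")
print(f"k_B T   ~ {E_thermal*1e3:.3g} meV")   # ~26 meV
print(f"Zeeman  ~ {E_zeeman*1e3:.3g} meV")    # ~0.12 meV, << k_B T
```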

Tuesday, June 06, 2017

Follow-up: More 5nm/7nm/10nm transistors

Two years ago I wrote this post about IBM's announcement of "7 nm transistors", and it still gets many pageviews every week.   So, what's been going on since then?

I encourage you to read this article from Semiconductor Engineering - it's a nice, in-depth look at what the major manufacturers are doing to get very small transistors actually to market.  It also explains what those numerical designations of size really mean, and changes in lithography technology.  (This is one part of my book that I really will have to revise at some point.)

Likewise, here is a nice article about IBM's latest announcement of "5 nm" transistors.  The writing at Ars Technica on this topic is usually of high quality.  The magic words that IBM is now using are "gate all around", which means designing the device geometry so that the gate electrode, the one that actually controls the conduction, affects the channel (where the current flows) from all sides.  In old-school planar transistors, the gate only couples to the channel from one direction.

Later I will write a long, hopefully accessible article about transistors, as this year is the 70th anniversary of the Bell Labs invention that literally reshaped global technology.

Thursday, June 01, 2017

What should everyone know about physics?

A couple of weeks ago, Sean Carroll made an offhand remark (the best kind) on twitter about what every physics major should know.  That prompted a more thoughtful look at our expectations for physics majors by Chad Orzel, with which I broadly agree, as does ZapperZ, who points out that most physics majors don't actually go on to be physics PhDs (and that's fine, btw.)  

Entertaining and thought-provoking as this was, it seems like it's worth having a discussion among practicing physicist popularizers about what we'd like everyone to know about physics.  (For the pedants in the audience, by "everyone" I'm not including very young children, and remember, this is aspirational.  It's what we'd like people to know, not what we actually expect people to know.)  I'm still thinking about this, but here are some basic ingredients that make the list, including some framing topics about science overall.
  • Science is a reason- and logic-based way to look at the natural world; part of science is figuring out "models" (ways of describing the natural world and how it works) that have explanatory (retrodictive) and predictive power.   
  • Basic science is about figuring out how the world works - figuring out the "rules of the game".  Engineering is about using that knowledge to achieve some practical goal.  The line is very blurry; lots of scientists are motivated by and think about eventual applications.
  • Different branches of science deal with different levels of complexity and have their own vocabularies.  When trying to answer a scientific question, it's important to use the appropriate level of complexity and the right vocabulary.  You wouldn't try to describe how a bicycle works by starting with the molecular composition of the grease on the chain....
  • Physics in particular uses mathematics as its language and a tool.  You can develop good intuition for how physics works, but to make quantitative predictions, you need math.
  • Ultimately, observation and experiment are the arbiters of whether a scientific model/theory is right or wrong, scientifically.  "Right" means "agrees with observation/experiment whenever checked", "wrong" means "predicts results at odds with reality".  Usually this means the model/theory needs an additional correction, or has only a limited range of applicability.  (Our commonplace understanding of how bicycles work doesn't do so well at speeds of thousands of miles an hour.  That doesn't mean we don't understand how bikes work at low speeds; it means that additional effects have to be considered at very high speeds.)
  • There are many branches of physics - it's not all particle physics and astrophysics, despite the impression you might get from TV or movies.
  • Physics explains light in all its forms (the microwaves that heat your food; the radio waves that carry your wifi and cell phone traffic; the light you see with your eye and which carries your internet data over fibers; the x-rays that can go through your skin; and the really high energy gamma rays that do not, in fact, turn you into an enormous green ragemonster).  
  • Physics includes not just looking at the tiniest building blocks of matter, but also understanding what happens when those building blocks come together in very large numbers - it can explain that diamond is hard and transparent, how/why water freezes and boils, and how little pieces of silicon in your computer can be used to switch electric current.  Physics provides a foundation for chemistry and biology, but in those fields often it makes much more sense to use chemistry and biology vocabulary and models.  
  • Quantum mechanics can be unintuitive and weird, but that doesn't mean it's magic, and it doesn't mean that everything we don't understand (e.g., consciousness) is deeply connected to quantum physics.  Quantum is often most important at small scales, like at the level of individual atoms and molecules.  That's one reason it can seem weird - your everyday world is much larger.
  • Relativity can also be unintuitive and weird - that's because it's most important at speeds near the speed of light, and again those are far from your everyday experience.   
  • We actually understand a heck of a lot of physics, and that's directly responsible for our enormous technological progress in the last hundred and fifty years.
  • Physicists enjoy being creative and speculative, but good and honest ones are careful to point out when they're hand waving or being fanciful. 
I'll add more to this list over time, but that's a start.... 

Wednesday, May 24, 2017

Hot electrons and a connection to thermoelectricity

The two recent posts about the Seebeck effect and hot electrons give some context so that I can talk about a paper we published last month.

We started out playing around with metal nanowires, measuring the open-circuit voltage (that is, hook up a voltmeter across the device, which nominally doesn't allow current to flow) across those wires as a function of where we illuminated them with a near-IR laser.  Because the metal absorbs some of the light, that laser spot acts like a local heat source (though figuring out the temperature profile requires some modeling of the heat transfer processes).   As mentioned here, particles tend to diffuse from hot locations to cold locations; in an open circuit, a voltage builds up to balance out this tendency, because in the steady state no net current flows in an open circuit; and in a metal, the way electron motion and scattering depend on the energy of the electrons sets the magnitude and sign of this process.   If the metal is sufficiently nanoscale that boundary scattering matters, you end up with a thermoelectric response that depends on the metal geometry.  The end result is shown in the left portion of the figure.  If you illuminate the center of the metal wire, you measure no net voltage - you shouldn't, because the whole system is symmetric.  The junction where the wire fans out to a bigger pad acts like a thermocouple because of that boundary scattering, and if you illuminate it you get a net thermoelectric voltage (the sign depends on how you pick ground and which end you're illuminating).   Bottom line:  Illumination heats the electrons a bit (say a few Kelvin), and you get a thermoelectric voltage because of that, to offset the tendency of the electrons to diffuse due to the temperature gradient.  In this system, the size of the effect is small - microvolts at our illumination conditions.
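
For a sense of the magnitude, here's a minimal sketch of the thermocouple arithmetic; the Seebeck coefficient difference and the temperature rise are representative guesses, not fitted values from the paper:

```python
# Open-circuit thermoelectric voltage: V ~ (S1 - S2) * dT for a thermocouple
# made from the "wire" and "pad" regions of the same metal.
dS = 0.5e-6     # V/K, guessed difference in Seebeck coefficients from
                # boundary scattering (bulk Au is itself only a few uV/K)
dT = 3.0        # K, guessed laser-induced temperature rise

V = dS * dT
print(f"V ~ {V*1e6:.1f} uV")    # ~1.5 uV: microvolts, as observed
```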

Now we can take that same nanowire, and break it to make a tunnel junction somewhere in there - a gap between the two electrodes where the electrons are able to "tunnel" across from one side to the other.  When we illuminate the tunnel junction, we now see open-circuit photovoltages that are much larger, and very localized to the gap region.  So, what is going on here?  The physics is related, but not true thermoelectricity (which assumes that it always makes sense to define temperature everywhere).   What we believe is happening is something that was discussed theoretically here, and was reported in molecule-containing junctions here.   As I said when talking about hot electrons, when light gets absorbed, it is possible to kick electrons way up in energy.  Usually that energy gets dissipated by being spread among other electrons very quickly.  However, if hot electrons encounter the tunnel junction before they've lost most of that energy, they have a higher likelihood of getting across the tunnel junction, because quantum tunneling is energy-dependent.  Producing more hot electrons on one side of the junction than the other will drive a tunneling current.  We still have an open circuit, though, so some voltage has to build up so that the net current in the steady state adds up to zero.  Bottom line:  Illumination here can drive a "hot" electron tunneling current, and you get a photovoltage to offset that process.  This isn't strictly a thermoelectric effect because the electrons aren't thermally distributed - it's the short-lived high energy tail that matters most.
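
To see why the energy dependence matters so much, here's a toy WKB estimate for a rectangular tunneling barrier; the barrier height, gap width, and the 1 eV "hot" excess are illustrative choices, not parameters from our devices:

```python
from math import exp, sqrt

# WKB transmission through a rectangular barrier: T ~ exp(-2 kappa d),
# with kappa = sqrt(2 m (phi - E)) / hbar.  All numbers are illustrative.
hbar = 1.055e-34    # J*s
m_e = 9.11e-31      # kg, electron mass
eV = 1.602e-19      # J per eV

phi = 5.0 * eV      # barrier height above the Fermi level, ~ a work function
d = 1.0e-9          # m, tunneling gap width

def transmission(E_excess_eV):
    """Transmission for an electron E_excess_eV above the Fermi energy."""
    kappa = sqrt(2 * m_e * (phi - E_excess_eV * eV)) / hbar
    return exp(-2 * kappa * d)

cold = transmission(0.0)   # electron at the Fermi energy
hot = transmission(1.0)    # "hot" electron 1 eV above the Fermi energy
print(f"hot/cold transmission ratio ~ {hot/cold:.0f}")   # ~an order of magnitude
```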

It's fun to think about ways to try to better understand and maximize such effects, perhaps for applications in photodetection or other technologies....