The Everyday Chemist
Saturday, 14 May 2016
A while back, I wrote a post titled "Why is foaming difficult?". There, I introduced the idea that creating an interface always costs energy, and that the invisible forces at work often try to minimize surface area, for example by taking on spherical shapes (since, for a given volume, a sphere has the highest volume-to-surface ratio). But at the time, I did not have a great answer as to why making surface always takes energy.
Recently, I had a new idea. First, where do surfaces exist? Where there is phase separation. Why is there phase separation? Because the two phases are not miscible. Why are they not miscible? Because they like themselves better than each other... Since a surface forces the molecules to lose interactions with one another and form interactions with the other phase (which, by virtue of the fact that the phases are not miscible, is less favorable), the surface costs energy.
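As a quick illustration of that last point about spheres, here is a minimal sketch (my own example, with an arbitrary droplet volume and the textbook surface tension of water) comparing the surface area, and hence the surface energy, of a sphere and a cube holding the same volume:

# Rough numerical check (my own illustration, not from the post above): for a fixed
# volume, compare the surface area of a sphere and a cube to see why droplets and
# bubbles tend toward spherical shapes when surface costs energy.
import math

V = 1.0e-9  # 1 mm^3 of liquid, expressed in m^3 (arbitrary choice)

# Sphere: V = 4/3*pi*r^3  ->  r = (3V / 4pi)^(1/3),  A = 4*pi*r^2
r = (3 * V / (4 * math.pi)) ** (1 / 3)
A_sphere = 4 * math.pi * r ** 2

# Cube: V = a^3  ->  a = V^(1/3),  A = 6*a^2
a = V ** (1 / 3)
A_cube = 6 * a ** 2

gamma = 0.072  # surface tension of water near room temperature, J/m^2 (textbook value)
print(f"sphere area = {A_sphere:.3e} m^2, energy = {gamma * A_sphere:.3e} J")
print(f"cube   area = {A_cube:.3e} m^2, energy = {gamma * A_cube:.3e} J")
# The sphere always has the smaller area (and hence lower surface energy) for the
# same volume, which is the "highest volume-to-surface ratio" point above.

The same comparison holds for any non-spherical shape, which is why minimizing surface energy pushes drops and bubbles toward spheres.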
Monday, 4 January 2016
Invisible Gorilla Syndrome
I was listening to a talk in which the speaker criticized nanotechnology, claiming it is a field built on ungrounded assumptions. I have previously worked with gold nanoparticles for diagnostic uses, and I also worked with more conventional formulations at a pharmaceutical company. Since then I have been wondering why nanotech is so popular: while nanotech papers are everywhere, I have not seen much of it in health care. He addressed some interesting issues.
Most interestingly, however, he mentioned what is called the invisible gorilla syndrome: people become so focused on one aspect that they lose sight of the obvious. Personally, I think of myself as more of a connector than a detail-oriented person; I like to see connections and the big picture. Recently, though, I have found myself becoming focused on a single thing. I think the nature of research makes people focused. It's difficult to understand the world as is, so we divide it into small pieces that we can digest, and after deciding on a piece to tackle, we forget about the other pieces.
It made me realize I need to take a step back from my own research and reflect from time to time. I shouldn't be trapped in my own topic or field. Every now and then, I need to read the general literature, if only to identify the other pieces and think about the big picture.
Saturday, 19 December 2015
Popcorn
Microwave some corn kernels and a bit of magic makes popcorn. When I was a kid, I thought popcorn could only be made in popcorn bags and that butter was necessary for it to work.
I had no idea why microwaving the bag made popcorn, but I knew that the sound coming from the bag indicated how far along the popping process was.
Now I realize that at the heart of popcorn making is not magic, but science. All you really need is sufficient heat and some popcorn kernels.
Although popcorn kernels seem dry, they are not devoid of moisture. When you heat a kernel, the water inside turns to steam and the pressure builds. Once that pressure becomes high enough, the hull breaks, the starchy interior expands, and the internal pressure drops back to atmospheric. The fluffiness comes from the low density produced by this volume expansion.
I think the popping sound might be due to the expansion being faster than the speed of sound, though I'm not sure; it might just be the shell breaking. A high-speed camera might provide some answers.
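Just for fun, here is a rough back-of-the-envelope check (my own numbers, not measurements) of how much pressure the trapped moisture could reach at a popping temperature of around 180 C, and how much the steam expands once the hull gives way:

# Back-of-the-envelope estimate (my own rough numbers): the pressure the trapped
# moisture can build before the hull breaks, and the expansion once it is released.
import math

R = 8.314          # J/(mol K)
dH_vap = 40.7e3    # J/mol, heat of vaporization of water near 100 C

# Clausius-Clapeyron estimate of water's vapor pressure at ~180 C (roughly the
# temperature at which kernels are said to pop), starting from 1 atm at 100 C.
T1, P1 = 373.15, 101.325e3   # boiling point at 1 atm
T2 = 453.15                  # ~180 C
P2 = P1 * math.exp(-dH_vap / R * (1 / T2 - 1 / T1))
print(f"vapor pressure at 180 C ~ {P2/1e3:.0f} kPa (~{P2/101.325e3:.0f} atm)")

# Once the hull ruptures, the steam expands back to 1 atm. Compare the volume of
# 1 kg of steam at 100 C and 1 atm (ideal gas) with ~1 L for 1 kg of liquid water.
M = 0.018  # kg/mol
v_steam = R * T1 / (101.325e3 * M)   # m^3 per kg
print(f"expansion factor vs liquid water ~ {v_steam / 1e-3:.0f}x")

The roughly tenfold overpressure and the large expansion on release are consistent with the "pressure builds until the kernel breaks, then the volume jumps" picture above.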
Saturday, 5 December 2015
scientific terms
Recently, I've been thinking about the definitions of various things. In science, many things are defined operationally. I've come across questions about whether glass is a liquid or a solid. There is a common misconception that glass is a liquid, and Derek addresses this here: https://www.youtube.com/watch?v=c6wuh0NRG1s&feature=youtu.be&a.
More importantly, it brings to light an interesting general problem. We have definitions and boundaries. Solids are things whose atoms do not flow... but on what time scale? If you wait long enough, solids can also "flow" (the earth's mantle, for example). Are solids and liquids actually a continuum?
As Derek suggests, these definitions and classifications create misconceptions. More troublesome is that they create a mental block. Is the world really made up of solids and liquids? I think not. These are models introduced by humans, and by giving them names and using the terms routinely, we forget that they are models describing the physical world around us. It is well known that there is a close tie between culture and language, but it seems language really shapes the way we think, even in science. There is no perfect observer without bias.
Sunday, 29 November 2015
microwave heating problem
I'm heating up my food using a microwave. I'm heating some sweet potatoes, and since I mash them with milk later, I figured I could immerse them in milk during the heating. But unlike the sweet potato alone, which heats up really fast, the milk + sweet potato doesn't heat up that fast... That got me thinking... Why?
Here's my hypothesis. The increase in temperature is associated with an increase in the AVERAGE thermal energy. In thermodynamics, temperature is an intensive property (it does not scale with size). As opposed to the average, think about the total thermal energy: at the same temperature, as the amount of stuff increases, the total thermal energy increases. That is an extensive property.
Now, let's think about how much heat we need to put in to change the temperature. For example, we know from experience that metal heats up really fast; metal has a low heat capacity compared to, say, water. (We shouldn't confuse heat capacity with thermal conductivity. Metals have high thermal conductivity, which means heat propagates through them quickly; it says nothing about how much energy is needed to raise their temperature.)
Now, I think the answer is somewhat simple... The sweet potato alone has a lower total heat capacity and thus needs less energy to heat up. Assuming the heat input is constant, its temperature (how I measure "hotness") rises faster. Adding milk means adding more stuff, and stuff with a high specific heat at that... you need more energy to change its temperature by the same amount.
But then one question remains... is the heat input really constant? I know that the microwave is operating at the same power (energy per time), but that does not tell me the energy all ends up as heat in the food... A microwave works by injecting microwave radiation into the system, which the food should absorb. Molecularly, the polar species that absorb the radiation (mainly water) are driven into rotation by the oscillating field, and that motion is dissipated as heat. So if there are more radiation-absorbing species, the conversion to thermal energy might be greater. I don't think this plays a big role in my milk + sweet potato case... but I do wonder about other food matter... I know that water tends to absorb this radiation, or at least a lot of posts online talk about how water absorbs this energy... but I'll need to think more about what happens with almost completely dry materials.
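To put rough numbers on the heat-capacity part of the argument, here is a minimal sketch using Q = m c dT. The power, heating time, masses, and specific heats are my own rough guesses, not measurements, and it assumes all of the delivered microwave energy ends up as heat:

# Quick sanity check of the heat-capacity argument using Q = m * c * dT.
# All numbers below are rough assumptions for illustration.
POWER = 800.0        # W delivered to the food (assumed constant, as in the post)
TIME = 120.0         # s of heating
Q = POWER * TIME     # J absorbed, assuming all of it ends up as heat

# Rough specific heats in J/(g K): sweet potato is mostly water plus starch,
# milk is mostly water.
c_potato, c_milk = 3.4, 3.9
m_potato, m_milk = 300.0, 250.0   # g

dT_alone = Q / (m_potato * c_potato)
dT_mixed = Q / (m_potato * c_potato + m_milk * c_milk)
print(f"potato alone:  dT ~ {dT_alone:.0f} K")
print(f"potato + milk: dT ~ {dT_mixed:.0f} K")
# Same energy in, more (and higher-specific-heat) mass -> smaller temperature rise.

With these guesses the temperature rise roughly halves when the milk is added, which matches the everyday observation even before worrying about how efficiently the radiation is absorbed.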
Friday, 25 September 2015
Institutional corruption and off-label drug use
Today, I attended a presentation by an invited professor on institutional corruption and off-label drug use. He introduced the problem of misaligned interests: the public wants prescriptions that will solve their health issues, while the pharmaceutical industry wants to maximize profits.
In particular, he discussed the problem of off-label drug use. The idea is that a drug approved for indication A (indication is a fancy word for illness) is used for indication B. This is obviously good for the pharmaceutical industry, because they can sell more; more importantly, they do not need to conduct very expensive clinical trials. However, this may not always be in the interest of the patient. Then again, it may be: just because there is no direct scientific evidence for a use does not mean it is not effective.
What is interesting in this problem is the roles and interactions of, say, four independent bodies. First, there is the patient, who supposedly knows nothing. Second, there is the doctor prescribing the medicine. Third, there is the manufacturer/distributor of the medicine (the pharmaceutical companies). Fourth, there is the medicine regulator, the government agency/law (say, the FDA).
Suppose the patient is prescribed something off-label. This is the doctor's judgement (taking the meds is the patient's judgement, but I assume here that patients do as doctors say). The doctors are the final decision makers in this model. Pharmaceutical companies can influence their decisions through advertisement and, I suppose, more direct financial means. By law, they cannot directly promote their medication for off-label use. However, they can promote it indirectly by providing evidence from other bodies suggesting efficacy for off-label use; they may, for example, fund independent parties to conduct such research.
The FDA has practically no role other than regulating the initial approval and penalizing off-label advertisement. However, even after paying such fines, off-label advertisement can apparently pay off, and once off-label use becomes the norm, marketing is no longer needed.
Dr Rodwin proposed three things to address this problem:
First, off-label use should be tracked. Currently, it is difficult to manage or even understand the true extent of off-label use (although he did cite some statistics). Tracking can be enforced by restricting reimbursements unless the purpose of the drug is provided.
Second, using this data, if off-label use of a drug becomes widespread, the pharmaceutical company should be required by law to conduct clinical trials.
Third, the incentive for promoting off-label use should be removed. Only on-label use should earn the company its patent premium: using the off-label use data, only the manufacturing cost would be paid to the pharmaceutical company for off-label sales, and the remainder taken back.
Overall, the proposal is very interesting. The obvious caveat is that it may meet a lot of resistance from the pharmaceutical companies. However, he believes it is the same as with the implementation of previous regulations, such as full disclosure of ingredients. Apparently, in the 1900s, new laws that potentially increased costs for the pharmaceutical companies were passed; he believes such measures seem difficult at first but eventually become the norm.
Even setting aside this caveat, some other professors in the room seemed to disagree slightly. They argued that the pharma industry faces larger problems. For example, on-label drugs may not be as safe as expected, and the fact that a use is on-label provides a false sense of security; making small adjustments just creates new loopholes and fixes nothing. This was a very interesting idea, but for now I will hold back on it and give my opinion of the original claim.
I find that the proposal focuses too much on what the pharma industry should do. He believes companies will do things as long as they can make money. I like his general approach: he treats every entity as an actor and simplifies the system almost into a game, with each playing its own role. In this game, his role is to propose rules and regulations that will force the other entities (pharmaceutical companies and others) to act in a way that benefits patients. Although he has not formalized the system into a game-theory setup, I can easily see that happening.
Following this scheme, he seems to think that the pharma companies have the upper hand over the doctors. His resolution involves mainly the pharma companies and not so much the doctors (the first proposal is about doctors reporting, but he aims to incentivize that behaviour by blocking reimbursements unless the information is provided). I particularly disagree with the second proposal. I do not understand why the pharmaceutical companies should pay to conduct such research just because they are making money from it. What if they believe the research would have a negative outcome? The clinical trial would seem like a complete waste of money (of course, they would also be blocking subsequent sales from that off-label use, but suppose they would rather NOT do the research and instead stop the doctors from prescribing, which they don't even have the ability to do!).
Ethically speaking, I think the doctors should be taking more precautions. We just need better doctors... not ones that induce babies just because they want to go home at 5 pm -_- (not sure if this is true since I just read it on reddit). But as a policy maker, I think it is more reasonable to track the larger and fewer entities (i.e., the pharma companies).
Back to the proposal: I think the off-label use incentive idea is quite good. Companies should be paid a premium (the patent-associated price) only for on-label drug use. That will motivate companies to acquire on-label status for more uses. It's not the drug that needs approval; it's the drug use. It's like getting a patent for an IDEA: it's not the substance itself but the idea the substance materializes that matters. The patent should not be for the drug but for the IDEA that the drug solves some problem.
There are just so many things I would like to change, and I find this interesting. What is theoretically reasonable and good? What can actually be done about it? And in the end, who holds that power?
Saturday, 19 September 2015
Thermal oscillation cooking
So today I'm having guests over, and I'll be serving a rack of pork ribs. I love to read, so while I was reading about the best way to cook it, I found many differing opinions. To make this discussion more "scientific", I will qualify what "best" means, limiting the goodness of the meat to its tenderness, its moistness, and its flavor absorption.
So let's think about the meat cooking process more rigorously. First and most obvious, I think, is the act of "cooking" and the associated denaturation of proteins: the proteins that make up the meat are deformed, and this often leads to shrinking. Also obvious is the rendering of fat; the fat melts and is somehow expelled. There are many other levels to cooking, including the breakdown of collagen.
In our case, we are using heat to cook, so heat is the central driving force. Suppose we place the meat in the oven. What happens? Well, the meat heats up and the meat is cooked. Easy? Not so fast. The heat doesn't heat up the meat instantaneously; it must propagate to the center to cook everything. That's why the size of the meat matters: larger meat -> longer cooking time.
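To put a rough number on that scaling: the time for heat to diffuse to the center goes roughly as the half-thickness squared divided by the thermal diffusivity. A minimal sketch, with a ballpark diffusivity I chose because meat is mostly water (not a measured value):

# Rough diffusion-time estimate (my own numbers): the time for heat to reach the
# center of a slab scales roughly as L^2 / alpha, where L is the half-thickness
# and alpha is the thermal diffusivity.
alpha = 1.4e-7  # m^2/s, assumed thermal diffusivity (roughly that of water)

for thickness_cm in (2, 4, 8):
    L = thickness_cm / 100 / 2          # half-thickness in m
    t = L ** 2 / alpha                  # characteristic diffusion time, s
    print(f"{thickness_cm} cm thick -> heat reaches the center in ~{t/60:.0f} min")

Doubling the thickness roughly quadruples the time, which is why a thick roast takes so much longer than a thin chop.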
Similarly, flavor absorption doesn't happen all at once; the flavor must diffuse into the meat. Thicker meat -> more time needed for the same flavor absorption. Absorption is faster for a more porous material, right? So poke holes. That works, but one possible problem is that the juices can also escape through those holes faster.
So let's talk about moistness. Moist meat is moist because it has a high water content. For that, two things must happen: first, there should be sufficient moisture around; second, the meat must be able to hold on to that moisture. The second part is often emphasized in cooking forums: over-cooked meat apparently cannot hold as much moisture. It makes some sense when we think about how moisture might be retained, namely between the fibers of the meat. If the meat contracts, this space is reduced.
Just a fun thought:
What if I oscillate the temperature, cooking at a high temperature and then a low temperature? If I choose some perfect frequency (or frequencies as a function of time), I think the cooking should be more even throughout, since the diffusion would match the rate at which heat is injected.
Another possibility is to increase flavor absorption by expanding and then contracting the meat: as the meat expands back, the flavors surrounding it will be sucked in.
Just a thought. I'm not really sure how reversible meat contraction is. I would guess that it is somewhat reversible, since the purpose of resting meat is to retain the juices when cutting. I venture to guess that the relaxation of the meat (reversible contraction) promotes moisture retention by increasing the space between the fibers.
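Out of curiosity, here is a toy 1D heat-conduction sketch (explicit finite differences) for playing with the oscillating-oven idea. The slab thickness, diffusivity, oven temperatures, and oscillation period are all assumptions of mine for illustration, not a claim about real cooking:

# Toy 1D heat-conduction simulation to compare a constant oven with an oscillating one.
# Everything here is a made-up illustration: a 4 cm slab of "meat", alpha ~ 1.4e-7 m^2/s,
# and a surface held either at 200 C or oscillating between 150 C and 250 C.
import math

alpha = 1.4e-7        # m^2/s, assumed thermal diffusivity of meat
L = 0.04              # slab thickness, m
N = 41                # grid points
dx = L / (N - 1)
dt = 0.4 * dx ** 2 / alpha   # stable time step for the explicit scheme

def simulate(oscillate, total_time=1800.0, period=600.0):
    T = [20.0] * N                      # start at room temperature, C
    t = 0.0
    while t < total_time:
        if oscillate:
            surface = 200.0 + 50.0 * math.sin(2 * math.pi * t / period)
        else:
            surface = 200.0
        T[0] = T[-1] = surface          # both faces held at the oven temperature
        # Explicit finite-difference update of the interior points
        T = [T[i] if i in (0, N - 1)
             else T[i] + alpha * dt / dx ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
             for i in range(N)]
        t += dt
    return T

for osc in (False, True):
    T = simulate(osc)
    print(f"oscillating={osc}: center {T[N // 2]:.0f} C, surface {T[0]:.0f} C")

Playing with the period and amplitude shows how much the interior actually feels the oscillation; because diffusion smooths out fast swings, only slow oscillations change the temperature profile deep inside, which is the matching-of-frequencies question raised above.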