Back in July, I wrote about oxyntomodulin, a hormone that had been isolated which tells the brain that the stomach is full. Pharmaceutical companies are now working on turning that knowledge into an oral diet drug. Now another compound has surfaced that suppresses the desire to eat and drink, and this one has its roots in medieval Scotland: Augustinian monks chewed the plant to suppress all urges to eat or drink. The plant, Lathyrus linifolius, was processed at a nearby hospital to make a potion. It is thought that this potion was brewed to help villagers lose weight or cope with the effects of a bad harvest.
Lathyrus linifolius appears to be fairly potent:
Dr Moffat said: “These tiny tubers are eaten two or three at a time. Chewed, they taste of leathery liquorice. Quite simply, according to all the reports we’ve compiled (around 300), people forget to eat and drink. They feel no need to eat and drink and this lasts for weeks, sometimes into months.”
The monks were from a 12th-century monastery at Soutra Aisle. They apparently helped at, or perhaps ran, an area hospital (the article is not clear), which was one of the most important medical centers in Scotland for quite some time.
If this plant is ever turned into a viable drug, it will have to be regulated, despite its “natural” origins, which might lead one to expect it in the over-the-counter herbal supplement section of the local pharmacy or health food store. The potential for abuse by those with eating disorders is too great for it to be available without a prescription.
This is sort of off-beat for polyscience.org, but I thought the social ramifications were worth investigating. The British government has instituted a trust fund program in which children (or their parents) can save money. Currently the only restriction is the amount that can be saved per year: £1,200. The government also gives parents a £250 voucher to be saved for their child’s future. The funds grow tax-free until the child’s 18th birthday.
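It’s easy to sketch what those numbers could add up to. The articles give no interest or return figures, so the 5% annual growth rate below is purely an assumption for illustration:

```python
# Rough sketch of how such a child trust fund could grow.
# The 5% annual return is an assumption; the article gives
# no interest-rate details at all.
def fund_value(voucher=250, annual=1200, years=18, rate=0.05):
    """Value at age 18: initial voucher plus yearly contributions."""
    balance = voucher
    for _ in range(years):
        balance = balance * (1 + rate) + annual
    return round(balance, 2)

# Maxed-out contributions every year, at the assumed 5% rate:
print(f"£{fund_value():,.2f}")
```

Even with no growth at all, maxed-out contributions come to £250 + 18 × £1,200 = £21,850, which is a tidy sum to hand an 18-year-old.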
The program seems so mind-bogglingly obvious that I’m surprised no one thought of it before now. I know many parents in the United States buy their children savings bonds for their first couple of birthdays, and I think such a program here in the US would be immensely popular, especially with grandparents. This new UK system is a bit more forward-thinking. I would have used such a system when I was earning money as a young teenager. It was a game to see how much money I could save in my savings account (passbook savings accounts represent!), and a program like this would have been really cool. I could see kids competing to see who could rack up the most money. (Those with the givingest families would win, of course.)
The articles I’ve been able to find on the topic don’t go into much detail — there doesn’t seem to be any information on interest rates (if any) or what happens when the child turns 18: does he or she have to pay taxes on it? Is withdrawal mandatory?
Another issue is that a government system like this is ripe for abuse. Witness Social Security in the US. Presumably the government tracks how much money each child has accumulated and will dole out what it owes on the child’s 18th birthday. So what happens to the money in the meantime? Does the government use it for something else? The whole thing, while socially progressive, seems like a good way to give the government a nice fat loan.
I’m always skeptical whenever I read something about “nanotechnology.” The actual definition of “nanotechnology” is
the science and technology of building electronic circuits and devices from single atoms and molecules.
So when I read headlines like “Nanocoating could eliminate foggy windows and lenses” I immediately become skeptical, even when it’s an institution like MIT turning out the press release. The word “nanotechnology” is a magnet for public attention, particularly among the geek crowd, and it is usually misused, often by people who should know better.
Anyway, mini-rant aside, researchers at MIT have developed a coating of “silica nanoparticles.” It’s not nanotechnology (save that it’s the right molecular size), but it is a fancy coating that prevents windows from fogging. What’s special about this new coating is that it doesn’t have the shortcomings of other anti-fog solutions: it doesn’t require UV light to function, nor does its effectiveness decrease over time. The approach to the fogging problem is novel:
When fogging occurs, thousands of tiny water droplets condense on glass and other surfaces. The droplets scatter light in random patterns, causing the surfaces to become translucent or foggy. This often occurs when a cold surface suddenly comes into contact with warm, moist air.
The new coating prevents this process from occurring, primarily through its super-hydrophilic, or water-loving, nature, Rubner says. The nanoparticles in the coating strongly attract the water droplets and force them to form much smaller contact angles with the surface. As a result, the droplets flatten and merge into a uniform, transparent sheet rather than forming countless individual light-scattering spheres. “The coating basically causes water that hits the surfaces to develop a sustained sheeting effect, and that prevents fogging,” Rubner says.
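The contact-angle mechanism in that quote can be sketched with a bit of geometry. Modeling a sessile droplet as a spherical cap, the ratio of its height to its contact radius is tan(θ/2), where θ is the contact angle; the sample angles below are assumptions for illustration, not measured values from the MIT coating:

```python
import math

# For a droplet modeled as a spherical cap sitting on glass, the
# ratio of droplet height h to contact radius a is tan(theta / 2),
# where theta is the contact angle. As theta approaches zero, the
# droplet flattens into the transparent sheet described above.
def height_to_radius(theta_deg):
    return math.tan(math.radians(theta_deg) / 2)

# Illustrative contact angles: ordinary glass vs. a hydrophilic
# surface vs. a super-hydrophilic one (values assumed).
for theta in (90, 40, 5):
    print(f"contact angle {theta:3d} deg -> h/a = {height_to_radius(theta):.3f}")
```

The smaller the contact angle, the flatter the droplet, until individual light-scattering spheres give way to a uniform film.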
I think drivers everywhere would love something like this, provided it doesn’t cost an arm and a leg. (I know I would.) Two auto manufacturers are already interested in the technology, as is the US military.
Update: Kudos to New Scientist for not using any form of the word “nanotechnology” in their coverage.
Don’t you just hate it when your laptop or cell phone’s battery runs out and you have to scramble to find a wall outlet to save your work or continue your call? Soon, you may no longer have to worry, thanks to the chemical engineers of Purdue University. At the annual meeting of the American Chemical Society today, those fine folks announced that they have developed a means by which the batteries we rely upon so much can be automatically recharged.
In their design, a credit card-sized cartridge containing hydrogen-releasing pellets would drive a fuel cell that provides charge to batteries as they are depleted. The pellets consist of compounds that produce hydrogen when reacted with water; a computer chip can monitor how many of these pellets have been used up in a cartridge and signal when a replacement is needed. Best of all, these cartridges are not only extremely portable but disposable as well; the byproducts of the reactions utilized are environmentally friendly, so used-up cartridges can easily be discarded or recycled.
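The article doesn’t name the hydrogen-releasing compound, but a common choice for water-activated hydrogen generation is sodium borohydride, which reacts as NaBH4 + 2 H2O → NaBO2 + 4 H2. Assuming that chemistry, the yield per gram of pellet is easy to estimate:

```python
# Assumed pellet chemistry (the article names no compound):
#     NaBH4 + 2 H2O -> NaBO2 + 4 H2
# Standard atomic weights: Na 22.990, B 10.811, H 1.008.
M_NABH4 = 22.990 + 10.811 + 4 * 1.008   # g/mol
M_H2 = 2 * 1.008                        # g/mol

def hydrogen_yield_g(pellet_mass_g):
    """Grams of H2 released from a given mass of NaBH4."""
    moles_nabh4 = pellet_mass_g / M_NABH4
    return moles_nabh4 * 4 * M_H2       # 4 mol H2 per mol NaBH4

print(f"{hydrogen_yield_g(1.0):.3f} g of H2 per gram of pellet")
```

Roughly a fifth of the pellet’s mass comes off as hydrogen, with the water supplying the rest, which is why compact cartridges of this sort are attractive.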
Besides the obvious applications in portable devices for everyone from investment bankers to Marines, the inventors mention that their device may also be considered as a safe energy source for hardware in future space vehicles. Evgeny Shafirovich, a scientist who worked on the project, states:
“The Apollo 13 accident was caused by an explosion involving liquid oxygen, which is needed along with liquid hydrogen to feed a fuel cell in spacecraft. Use of chemical mixtures, such as ours, for generation of hydrogen and oxygen would eliminate the possibility of such an explosion.”
And, of course, any technology that can spare NASA further problems is a good thing.
The first time I ever watched the movie Gladiator, I was confused by the scene where Maximus is captured by the slavers. The scene shows the gash on his arm covered in what looked like pieces of rice. I was doubly confused when his friend said “No. They will clean it. Wait and see.” How could pieces of rice clean a wound? Well, I learned later that they were maggots. (Hey, give me a break, I was in high school.) A lot of people are repulsed by this scene, and I was no exception. This revulsion is compounded by internet rumors of maggots eating brains while a person is still alive. (That link is extremely gross; you’ve been warned.)
The truth of the matter is that maggots are useful little creatures: the only flesh they consume is already dead. That makes them well suited to cleaning especially nasty wounds. Often synonymous with death, maggots can breathe fresh life into limbs once thought lost. Only relatively recently have they gone out of vogue in the medical community, displaced by modern alternatives like antibiotics.
Well, now they’re making a comeback, along with medicinal leeches. Maggots can clean particularly nasty wounds better than almost anything else out there, preventing infection and allowing new tissue to grow in place of the old.
As for maggots, they are unparalleled in their ability to clean festering, gangrenous wounds. For diabetics and others whose wounds fail to heal, maggots, pressed into dying flesh by wire-mesh bandages, can save a limb and speed healing.
The modern champion of using maggots as a tool to heal wounds was a World War I doctor named William Baer, who once saw two soldiers left wounded on the battlefield for days without care. When their clothes were removed, thousands of maggots were found in their wounds, but once the maggots were cleaned out, Dr. Baer discovered clean, pink, new flesh growing.
Leeches are also staging a comeback for the unique properties that they bring to recovery rooms. Leeches secrete a few different things, all of which are useful when re-attaching severed appendages:
Leeches naturally inject patients with a potent chemical cocktail that includes an anticoagulant, an anesthetic, an antibiotic and a substance that dilates blood vessels. This cocktail encourages fast bleeding to empty the appendage of extra blood, reducing pressure and allowing veins to form on their own.
In 20 minutes, a leech is usually engorged and removed, though bleeding from the wound may continue for up to 24 hours. If an appendage is large, several leeches are sometimes used at once, Dr. Levin said, adding, “I’ll use one to three leeches every couple of hours.”
“It won’t attach if there’s not good arterial blood coming in, and sometimes that tells me that I need to go back in,” Dr. Minkin said.
For some reason I can just imagine a prescription being issued for fresh leeches:
apply 1 leech q3° prn ud
Haha! (Translation: apply one leech every three hours, as needed, as directed.) Anyway, it’s nice to see some very useful age-old remedies making a comeback. One can hope that the societal hang-ups associated with such useful creatures will eventually be a thing of the past.
Back in 1847, the American Medical Association (the AMA) was created by Dr. Nathan Smith Davis in an attempt to elevate the practice of medicine. At that time, medicine was dominated by two major schools of thought: the allopaths and the homeopaths. Dr. Davis was an allopath. It has been argued that the AMA was created to, among other things, discredit the homeopathic school of medicine. In that respect it was highly successful: go to a doctor today and you are given allopathic treatments, not homeopathic remedies.
Allopathy, in a nutshell, is the practice of combating illness with opposites. If you have an infection, you are given an antibiotic. If you have cancer, you receive radiation or chemotherapy to kill the cancerous cells. Largely, these treatments work: people have longer life spans today than they did 100 or 200 years ago. We’ve come a long way in the study of medicine, but there are some things that medicine simply has no answer for, or for which the answer is not an optimal way of dealing with the condition. Witness oral steroids such as prednisone and Medrol: long-term use of these medications is not healthy, and the side effects can be worse than the condition they’re being used to treat. The same is true of the inflammatory bowel diseases such as Crohn’s disease and the various forms of colitis: these diseases don’t have treatments that are especially effective without doing drastic things like suppressing the immune system or removing parts of the intestines.
There has long been an “alternative” movement in medicine. Though it has become popular only in recent years, this school of thought has been around for quite a long time. These natural remedies are often looked down on by those in the medical profession, especially doctors. I think this is a mistake, for a number of reasons that I will expound upon in a moment. First, I want to explain the principle behind homeopathy.
Homeopathy essentially amounts to taking a substance that would normally produce symptoms like those of the illness, diluting it many times over, and then introducing the diluted solution into the body, where it is supposed to help the body combat whatever illness or irritant is ailing you. Basically, fighting like with like. This is the exact opposite of the traditional allopathic medicine that is mainstream today.
One of the allopaths’ arguments against homeopathy is that the substance is diluted so much that all that’s left is water. As far as “real” science is concerned, water molecules cannot “absorb” the properties of an irritant and still be water; that would make it a water-plus-something-else solution. But nothing else is present; it’s just water. So from a chemical point of view, what you’ve got is water being used to treat everything from poison ivy to stomach cancer.
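The dilution argument is easy to put in numbers. A common homeopathic potency is “30C,” meaning a 1:100 dilution repeated thirty times; even starting from a full mole of active substance, the expected number of original molecules left behind is effectively zero:

```python
# A "30C" remedy is a 1:100 dilution performed 30 times, i.e. a
# dilution factor of 10^-60. Starting from one mole of substance
# (~6.022e23 molecules), compute how many molecules survive.
AVOGADRO = 6.022e23

def molecules_remaining(start_molecules, c_potency):
    return start_molecules * (1 / 100) ** c_potency

expected = molecules_remaining(AVOGADRO, 30)
print(f"expected molecules left after 30C: {expected:.1e}")
```

The expectation works out to around 10⁻³⁷ molecules, which is to say: the odds of even one molecule of the original irritant remaining in the bottle are astronomically small.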
I stated in the very first post here on polyscience.org that I was a pharmacy student. As such, most of my schooling is done on the allopathic side of things. We study the drugs that are used today. Slowly, though, the curriculum is changing; we’re learning about herbs and probiotics and even homeopathy.
In general, I prefer a good allopathic remedy that I know is going to work, but I hesitate to question the efficacy of alternative therapies, for many reasons. Perhaps the most blindingly obvious one is that doctors, pharmacists, and even the drug companies simply don’t know why some drugs work. This is usually due to an incomplete understanding of the specific physiology underlying the body process we’re trying to alter. It seems reasonable, then, not to dismiss alternative medicines and therapies out of hand. One of these is homeopathy, despite its not making “scientific” sense. It has been relegated to the ignoble position of “placebo,” subject to ridicule from layperson and medical professional alike.
But being deemed a “placebo” isn’t necessarily a bad thing. Any medical professional worth his salt will tell you that getting better is often simply a state of mind. (“Simply,” of course, being anything but simple.) Twenty years ago, it was not uncommon for a doctor to literally write a prescription for a placebo. The patient would take this prescription to the pharmacy, where it would be filled; the capsules dispensed were usually filled with lactose, the filling agent used to take up the extra space in a capsule for a custom-compounded medication. And the patients were happy, and usually got better. Not all the time, of course, but most of the time.
This brings me to the catalyst for this post. The Lancet is probably the most prestigious medical journal in the world, and recently it has once again shot down the benefits of homeopathic therapy. They might be right; I don’t know. It’s impossible to know, and the researchers have acknowledged as much.
Professor Egger said: “We acknowledge to prove a negative is impossible.
“But good large studies of homeopathy do not show a difference between the placebo and the homeopathic remedy, whereas in the case of conventional medicines you still see an effect.”
What I do know, however, is that given our incomplete understanding of the human body, dismissing all alternative therapies out of hand is a mistake, and I think that over the next 100 years, we allopaths will discover this. I’m not saying whether homeopathy is effective or “just” the placebo effect, but I do think that we should continue to investigate “alternative” therapies, homeopathy included, even if only to disprove their efficacy. Disproving something definitively is never a waste of time or money.
I was reading New Scientist, and I found something new and cool to learn about: DNA reassociation.
It’s a technique for determining the number of different types of organisms in a given amount of material. A chemical is added that causes the DNA double helix to “unzip,” leaving single strands of DNA. The single strands are mixed with many other single strands of DNA, and the amount of time it takes for the strands to find a match yields a rough estimate of how many distinct types of organisms are present in the sample. The point of the actual article was to say that there are many more species of bacteria underground than were previously thought. I just thought the DNA reassociation thing was much cooler.
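The “time to find a match” idea can be sketched with standard second-order reassociation kinetics (the math behind so-called Cot analysis). The single-stranded fraction decays as 1 / (1 + k·C·t), where C is the concentration of each distinct sequence; if N different species share the same total DNA concentration, each sequence is N times more dilute, so half-reassociation takes N times longer. The rate constant and concentrations below are illustrative values, not figures from the article:

```python
# Second-order reassociation kinetics: fraction still single-stranded
# at time t is 1 / (1 + k * C * t), where C is the concentration of
# one distinct sequence. With N equally abundant species in the pot,
# each sequence has concentration total_conc / N.
def single_stranded_fraction(k, total_conc, n_species, t):
    per_sequence_conc = total_conc / n_species
    return 1 / (1 + k * per_sequence_conc * t)

def half_reassociation_time(k, total_conc, n_species):
    # Time at which half of the strands have found their partners;
    # it scales linearly with the number of distinct species.
    return n_species / (k * total_conc)

k, total = 1.0, 1.0  # illustrative units
for n in (1, 100, 10_000):
    print(f"{n:6d} species -> t(1/2) = {half_reassociation_time(k, total, n):g}")
```

Note that this simple form assumes all species are equally abundant, which is exactly the assumption the new work described below gets rid of.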
When this technique was applied to soils in the late 1990s, it suggested that a gram of dirt contained about 16,000 species. But this estimate assumed that the populations of all the different species in the soil were roughly equal in size. So Gans and his colleagues have developed new equations to reanalyse the same DNA reassociation data but without this size assumption.
Their results reveal that there are a few very common species in soil but lots of rare species. “There is a very large number of low abundance species,” says Gans. So many rare species, in fact, that the estimate of bacterial biodiversity rises to one million species per gram of soil.
If you do an Internet search for “DNA reassociation” you’ll find lots of academic websites explaining the equations that describe rates of reassociation and how they relate to real-world strand diversity in a given sample, but all of them (that I found) make the same assumption about the relative abundances of the species involved. I wonder how these new equations will alter the study of DNA reassociation kinetics.