Manic Fringe


The personal blog of the one and only Luc Gendrot. Internet Superhero. Not really.


An impassioned plea to nobody

Those great thinkers... such tricksters. (Photo by Hark! A Vagrant; click it, you'll be glad.)

"I would sweep the floors and scrape all the gum off the desks if you'd just let me into your research lab!"

I considered sending this in an email to a professor, asking about what lab positions were available. I decided against it.

I keep telling myself that I need to just focus on school, keep trying to learn outside of class, and that somehow people will know I'm bright, knowledgeable, and beyond passionate about science.

"Keep abreast of your field!"

Another thing I say to myself, only half knowing what I even mean, as if it will somehow bring the job offers flying in. (Spoiler: it doesn't.) And yet I keep on reading: papers, press releases, science blogs, and hyped-up news articles. Maybe I'm hoping the sheer brilliance emanating from my illuminated brain will attract potential lab positions.

Desperate might be a word I’d use, yes.

This blog, I suppose, is my way to try and differentiate myself from my peers. I like to write, and I love biology, science, and (recently) learning to program. A hipster science blog just seemed a natural progression of those things.

The need to differentiate myself stems from the competitive nature of most universities: it's not enough to attend and do well at the great school you got into, you have to prove that you can kiss ass, memorize the most, and step on as many backs as possible to get that A+. Not only do I hate this model, I'm also convinced that I'm more intelligent than a lot of my peers despite it. Too confident, you say? Probably. Warranted? Sometimes. You should see some of the kids at my school.

Actually, let me rephrase that altogether. I think I'm "more abreast of my field" than my peers.

Oh sure, some of my bio brethren probably have their lives planned out really well: they know which medical school they're going to, who they'd prefer to marry, what car they'll own in which town, where they'll retire, and anything between here and their end goal is just inconsequential. Undergraduate school is just a stepping stone along the way for many, a tiny cobblestone ready to be stepped on and whisked past. But my own reasons for wanting to study biology are different: they're more amorphous, more ephemeral, and more grounded in the genuine fascination I have with genetics and cell biology, and in the sincere wish to learn more, to become an expert in my field, and to share my experiences and expertise with the world. Perhaps my reasons are more grounded in selfishness as well; I have yet to determine that.

Growing up I was inundated with images of the great thinkers, innovators, and researchers who shaped the world around them with their love and appreciation for their area of study. Nikola Tesla, Benjamin Franklin, Rosalind Franklin, Watson, Crick, Koch, Lamarck, Mendel, the list goes on. The images of these incredible minds are so romanticized (history having distilled them to only their most admirable traits) that I've grown up thinking the pursuit of knowledge is a fundamental right, something that I could just...do. I always thought that my quest to conquer biology would be effortless. The image in my mind that represented my quest for learning was one of me sitting in a blank white room, a cup of coffee on the desk in front of me, silently steaming alongside the dozens of academic research papers, reference documents, and of course my own notes strewn everywhere. That image is, frankly, all I want in life.

I guess I made the mistake that every upper middle class white kid must make. I assumed that everything would be handed to me, no questions asked, as it has been for my entire adolescent life.

I assumed that my noble pursuit of knowledge would lead me down the correct path, and in my blindness mistakenly thought that path would be free of brambles. Nothing in my adolescence prepared me for the simple fact that I have to pay the bills somehow, and reading that latest paper on gut flora composition and Alzheimer's risk isn't going to do that.

The romantic in me rails against the unfairness of a system that would allow someone in whom the fires of scientific inquiry burn so brightly to fall by the wayside, as I feel I have. But then the pragmatist in me chimes in (sounding a bit too much like my father).

"Stay the course! If you want to learn as badly as you say you do, then stop whining and work within the system! Do what it takes!"

And I know that my inner pragmatist is right, as he often is (much to my chagrin). I know that I could sit in my room typing furiously, writing up blog posts like this one and hoping that my sympathetic wailing to the faceless masses of the internet might, like the work of those great thinkers before me, shape the world around me.

I also know that to do that would extinguish any chance I have left of actually pursuing my interests.

God dammit growing up sucks.

Higher Education is in trouble, as more and more of its teaching hires are part-time only.


I recently started reading a book called Academically Adrift by Richard Arum. It details the rise of the university as a privately and monetarily driven enterprise, one whose focus on properly teaching undergraduate students has been dramatically upended.

What's more, the faculty landscape at universities across the country has shifted dramatically: more than 50% of teachers at four- and two-year institutions are now adjunct professors, also called part-time professors. These adjuncts often have many of the same responsibilities and duties as full-time professors, but they get paid less. Often adjuncts are teaching at multiple universities at a time just to make a living wage. This is but one of the many reasons Arum believes the higher education system is abysmally failing its acolytes.

According to Arum, the monolithic entity that is the US higher education system is lent intellectual and moral legitimacy by the money it's able to accrue for itself and its investors and partners, and by the research it produces. In terms of effectively imparting critical thinking skills or a level of preparedness for joining the workforce, though, it falls short.

As a college student preparing to graduate soon, and moreover a frightened one with no concrete plans, no "real" research experience beyond lab courses and my own independent studies, and hardly any idea of what I should be doing to secure my future, Arum's book and the things it brings to light definitely hit home. Indeed, from what I've read so far, Academically Adrift should hit home for many current college students, because what Arum describes is exactly what students have been experiencing first-hand for at least a decade now.

It's beyond frustrating to know that the thing you've been dreaming about for practically your entire semi-adult life (graduating with a degree in biology) is a goal that has gone from noble to run-of-the-mill. More frustrating still is knowing that the institution you look to for your educational growth is more concerned with the research dollars professors can bring in than with fostering critical intelligence or an appreciation for learning.

Boston Twitter Awesomeness Part II

Literally right after my post the other day, the Boston drama continued late into the night.

The news itself is interesting enough: two suspects, one shot and killed in a shootout, the other apprehended the next day, and a civil rights debate ensuing across the 'blogosphere'. Miranda rights, explosives, shootouts, and most importantly (to me) a massive social media population following the events in real time and providing commentary. All my news-fix needs rolled into one very "futuristic" experience. It reminded me how awesome technology is (again).

My roommates and I stayed up a fair portion of the night, trying to follow the events as closely as we could on twitter and reddit. Anybody else following on twitter and listening to the police scanners must have felt, as I did, that they were practically there behind the police partitions. Seeing pictures of the scene all across twitter from reporters and citizen journalists on site and reading reports from dozens of nearby residents put the entire world right in the thick of things.

Not to beat a dead horse any more than I should, but that's pretty freakin' sweet! I'm sitting here in sunny California (at 2:00AM) and I'm getting on-the-fly reporting from dozens, potentially hundreds, of sources all the way across the country, and all of that in real time.

The nature of that on-the-fly reporting sparked a bit of debate between the social media savvy and those I presume to be journalism traditionalists. I find these debates a bit tiring, probably because a great deal of what I saw on my own twitter feed was people saying not to trust anything, and on reddit countless people reminding me that I should not trust what's being said on reddit. In my own opinion, anyone going to twitter for news should have the common sense to know that they'll very likely run across some bullshit while looking for real information.

I think people just look for reasons to distrust social media as a news outlet, because they fear its potential power! (MUAHAHAHA!...no? nobody?...Okay.)

Reddit casts its gaze upon the Boston Marathon suspects

Reddit responds to FBI's release of the two suspect photos

I've written before about reddit's potential for enacting change on a widespread scale. And now we get to see how reddit can help administer justice to those who deserve it.

In light of the Boston Bombings this past week the internet has been abuzz with eyewitness videos, news reports, unwarranted speculation, and especially in the case of Reddit, a helping attitude.

In at least one thread on reddit, concerning the FBI's release of photos showing the suspects, users who were at the marathon on April 15th were encouraged to send their photos in to the FBI to help identify the perpetrators. People posted dozens of photos of the FBI's suspects, offered ideas about who they were, and collaborated on enhancing photos.

It doesn't happen as often as it should, but displays like this show our species' potential for decency and our collective concern for one another. Other users made fun, but honestly I really enjoy seeing people trying to do their part together; it's uplifting, it's heartwarming, and it points to a greater potential than was perhaps realized in this instance.

I know that, in the scheme of things, the users of reddit may not have done all that much for the FBI in this case. What impresses me most about the information reddit drummed up is the gall with which its users went about collecting it. What an ambitious goal for a ragtag collection of internet lurkers looking for an information or entertainment fix to set out to accomplish. Imagine what it would mean if any of the pictures, videos, or accounts from people on reddit turned out to be the linchpin in the FBI's investigation. I know citizen journalism is fairly common, encouraged even, especially in our ubiquitously connected age, but there aren't many places where thousands of potential citizen journalists can come together and collaborate on a huge story like this. Reddit has shown the capacity for facilitating a meeting of minds like that, and that's really cool (have I said that enough yet?).

Consider, though, the potential good that a website that allows this sort of community organizing stands to offer the world. It’s interesting to think about, and I’m always looking forward to seeing how people use the technology at their disposal to help those around them.

Slacktivism and how to do it effectively


Slacktivism is probably the worst byproduct of our constantly connected, internet-obsessed culture (a culture I participate in with fervor). People post things like a picture of a youth with prosthetics running across a finish line and think that they've done the world a mound of good by "raising awareness". People tweet and facebook their messages against rape and homophobia, posting hashtags like #PrayforBoston when things like today's explosions at the Boston Marathon happen. Slacktivism is the laziest form of activism, and I'm including waterobics.

Now obviously in our fast paced business culture (whatever that means) people can’t be expected to be out marching on the streets every night for a cause, and so they look for ways to help from the comfort of their own home. People who want to can still do tangible good without leaving their house by donating money to causes they support. And really, considering the times, there's nothing wrong with that.

But are there solutions for those of us who maybe don't have enough money to donate to our favorite causes? Is there recourse for those of us who don't have the time to volunteer, or perhaps for those of us who (ever being socially conscious) want to avoid being that person posting incessantly and uselessly about feline irritable bowel syndrome on facebook, but still genuinely want to make a difference?

Of course there is you ninny. It’s 2013.

Enter the World Community Grid, one effort among many to use 'grid computing' to tackle some of the world's biggest research problems: cancer, protein structure prediction and folding, global warming, malaria, leishmaniasis, and a slew of other endeavors that require huge amounts of computing time and power.

Grid computing (a form of distributed computing) is defined rather aptly by the WCG website:

Grid computing joins together many individual computers, creating a large system with massive computational power that far surpasses the power of a handful of supercomputers. Because the work is split into small pieces that can be processed simultaneously, research time is reduced from years to months. The technology is also more cost-effective, enabling better use of critical funds.

In other words, the WCG is a technology developed by IBM that uses the spare system resources of your idling computer to perform computational tasks given to it by researchers working with the WCG. You simply install a small applet on your desktop and leave it running in the background. When your computer isn't doing anything CPU- or GPU-intensive, it will start allocating its extra resources to aiding in the furtherance of science!
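If you're curious what "splitting the work into small pieces" looks like in practice, here's a toy sketch in Python. To be clear, this is not the WCG/BOINC client or anything close to it; the function names and numbers are made up purely to illustrate the divide-and-conquer idea from the quote above.

# Toy illustration of the grid computing idea: one big job is split into
# small, independent "work units" that can be crunched in parallel and then
# merged. (Not the World Community Grid software; just the core concept.)
from multiprocessing import Pool

def work_unit(chunk):
    # Stand-in for a small research task (e.g. scoring one batch of candidate
    # molecules). Here it's just throwaway arithmetic.
    return sum(x * x for x in chunk)

def split(data, n_chunks):
    # Cut one big dataset into small, independently processable pieces.
    size = max(1, len(data) // n_chunks)
    return [data[i:i + size] for i in range(0, len(data), size)]

if __name__ == "__main__":
    big_job = list(range(1_000_000))      # the "years of work"
    chunks = split(big_job, n_chunks=8)   # the small pieces

    with Pool() as pool:                  # stand-in for thousands of volunteer PCs
        partial_results = pool.map(work_unit, chunks)

    print("combined result:", sum(partial_results))

On the real grid, the "pool" is thousands of volunteers' idle machines instead of your own CPU cores, and a central server hands out the chunks and collects the results, but the shape of the computation is the same.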

Now, if you’re like me and you have an active and fulfilling social life (hear that mom?), then you'll often find that your computer sits at home all day feeling neglected. Why not put it to work and help find solutions to a very deadly bacterial disease?

As for those of you shut-ins who actually USE your computers (I'm looking at you, the mirror), you can set the program to quietly run in the background and use a minimal amount of system resources, so that you can still aid science and browse tumblr and facebook effectively, you recluse ಠ_ಠ.

Find a full explanation of how the world community grid works here

Rat Brain technologies push brain interfaces leagues forward

Human-rat "telepathy" in action

I have a--probably unhealthy--obsession with superpowers of the mind. Characters like Professor X and Jean Grey from X-men are most often my favorites. That’s why I get so excited when I see breakthroughs like the following ones involving rat brains from the last few months:

Scientists move a live rat's tail using only their brain

Scientists have tentatively bridged the gap between the brains of other species and our own with a new interface that lets them make a rat's tail move involuntarily. The human is hooked up to an EEG that measures their brain activity, while a device using focused ultrasound lets the researchers stimulate the rat's motor cortex directly and noninvasively.

So while this might more accurately be called human-computer-rat telepathy, it still seems to herald a promising new age of neural technologies. Which is cool. Ever the technological optimist, I can totally envision a future where descendants of this technology allow you to use your Google Glass mounted NeuroConnect™ module to turn on your roomba (or control your personal robot assistant, but I'm not that optimistic) as soon as you think about it on your drive home from work.

Maybe I should write science fiction...

Rats help each other to solve puzzles telepathically

This one is even cooler for me because I also have a penchant for stories that involve hive-minds.

Neuroscientists at Duke University managed to connect two rats together using a BCBI (brain-computer-brain interface). The interface records neuronal activity as an "encoder" rat solves a puzzle in which it has to press a lever; when it solves the puzzle correctly the rat is rewarded, and the recorded signal is sent to either the "decoder" rat's sensory cortex or its motor cortex (in two separate sets of trials).

Astonishingly, the transfer of information between the two rats, mediated by the BCBI, allowed the decoder rats to solve problems they had never encountered about 67% of the time. Enough to be statistically significant, and enough to make this humble biology student say "HOLY CRAP THAT'S COOL."
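For a rough sense of why ~67% is convincing, here's a quick binomial sanity check in Python. The assumptions are mine, not the paper's: a two-lever task where pure guessing succeeds half the time, and a made-up total of 100 trials.

# Back-of-the-envelope check on the ~67% figure: if the decoder rat were just
# guessing on a two-lever task (50% chance per trial), how likely is a
# 67%-or-better success rate? The trial count below is hypothetical; the real
# study's numbers differ, so treat this as an illustration of the reasoning.
from math import comb

def p_at_least(k, n, p=0.5):
    # Exact binomial tail probability P(X >= k successes in n trials).
    return sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k, n + 1))

n_trials = 100   # hypothetical number of trials
successes = 67   # ~67% success rate
print(p_at_least(successes, n_trials))  # roughly 0.0005: very unlikely by pure chance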

I can't remember where I read it, but I remember reading that they're working on connecting more than two rats together. Again, always the optimist, I imagine a world where people can temporarily join "neural nets" of people all connected to one another with this technology, using their collective pool of knowledge and experience to solve complex heuristic problems. Imagine coupling this technology with a more advanced form of the non-invasive focused ultrasound mentioned above. There are a lot of potential applications for technology like this.

Obviously the complexities still need to be fleshed out before this becomes anywhere near commercial: developing more precise data collection techniques, determining how to connect larger areas of the brain using larger arrays of microelectrodes, perhaps perfecting non-invasive techniques for neuronal stimulation, among a slew of other hurdles. This sort of technology reaching the consumer is probably quite a ways off, but as with all things technological we stand to be pleasantly surprised by human innovation, if Moore's law holds.

References

Duke University Medical Center (2013, February 28). Brain-to-brain interface allows transmission of tactile and motor information between rats. ScienceDaily. Retrieved April 14, 2013, from http://www.sciencedaily.com/releases/2013/02/130228093823.htm

Yoo S-S, Kim H, Filandrianos E, Taghados SJ, Park S (2013) Non-Invasive Brain-to-Brain Interface (BBI): Establishing Functional Links between Two Brains. PLoS ONE 8(4): e60410. doi:10.1371/journal.pone.0060410

Oh EA you've done it again.

Well, they've done it again this year. EA Games was voted the worst company in America by The Consumerist's readers, beating Bank of America 77% to 23%. That's right, EA Games beat out America's most reviled group of bankers and business-people for the title of worst company. My question becomes: "How could a games company beat out some of the most stereotypically hated people in the world for such a...prestigious title?"

I don't know the answer to my question but it seems to say something about consumerism in our country, doesn't it?

I vaguely remember last year when EA first won "worst company in America" because of its supposedly awful customer support, its tendency to release games that the playerbase deemed "unpolished", and more recently its CEO's really shitty apology letter for being so shitty.

Based on that apology letter, I'd say they won't do much better next year either. Which is fine by me, because I don't play that many EA games nowadays. It does make me a little bit bitter, though, as EA was one of my most beloved childhood game providers. A fair portion of my favorite titles growing up came from EA, including several of the Harry Potter franchise games, The Sims, and LotR: RotK for Windows. 12-year-old me (and current me) thinks those games were (are) the shit.

Hearing "challenge everything" from the video clip above sent shivers down my spine, memories hearing it on Saturday mornings after waking up at the ass crack of dawn to play my play station for 20 straight hours. Not much has changed besides EA. It really is a shame.

And now enjoy this video of an old woman playing Battlefield 3 like a true 15 year old. It was suggested to me on the YouTube sidebar while I looked for the first video. Too precious not to share. She's so adorable!

Note: "Der Totschläger" roughly means "the killer", in case you're too lazy to look it up.

Oh my god she has more videos...Excuse me while I waste my day.

New advances in Memory and Processing Power


Okay. Let's reveal some of my ignorance here really quickly.

I have absolutely no background in computer science besides using a computer my entire adolescent and adult life, and living with four computer scientists. Which is to say, I have no idea what I'm talking about here.

So a while ago I read this article on Gizmodo entitled "Scientists Have Made the First Truly 3D Microchip" and besides scoffing at the fact that I just read a Gizmodo article, I also filed this little tidbit of information away under my hat so I could bring it up later with my computer scientist friends (see above) and seem smart.

And then today I read this article about some very promising magnetism research that could aid in the production of better RAM.

Well the gears took a while to turn because I have no idea what any of this is, and I'm certain this is a no-brainer to someone with a modicum more knowledge of the subject than I, but I thought a marriage between these breakthroughs should be a thing.

And then I started thinking about Moore's law and how this is just more proof that it's trudging along steadily and accurately.

And now this stream of consciousness post is over with. Thank you for joining us, we know you have a choice in short uninformative blog posts, and we're glad you came to Manic Fringe. Enjoy your day.

Immunotherapy uses special stem cells to improve the body's defenses.

T Cell Micrograph

Researchers at the University of Tokyo have developed and published a technique that could be a pretty substantial leap forward in cancer therapy, if it makes it to clinical trials.

The Japanese researchers developed a technique to induce CD8+ T cells (CTLs for short; also known as killer T cells), an immune cell produced by the body, to revert back to a stem cell state. They then induced these same stem cells to become killer T cells again. These new killer T cells were found to be "rejuvenated": they were able to effectively mount an immune response, and unlike their original counterparts they had a much longer effective lifespan.

What are T Cells?

CTLs, or cytotoxic T lymphocytes, are cells produced by the body that are responsible for destroying "sick" tissue cells. These tissue cells might be infected with a replicating virus or invading bacteria, or they might be cancer cells. T cells become "activated" when they encounter these sickly cells, and a mercy killing begins and ends with apoptosis of the target cell. After activation, T cells grow and divide rapidly, and all subsequent daughter cells can recognize and respond to the same types of sickly cells that the originally activated cell did. An individual may have hundreds (maybe thousands?) of distinct T cell populations, each population responding to a different disease marker called an antigen.

“Rejuvenated Cells” appear younger

T cells can make mistakes, however, and can end up killing normal healthy cells. After a while the body has ways of causing the killer T cells to lose their potency, so as to mitigate major damage to healthy cells. The “rejuvenated” cells that the researchers at the University of Tokyo created are not induced to lose their potency as quickly as they would have been without rejuvenation, resulting in a longer and more immunologically active lifespan.

The answer to this elongated lifespan? The rejuvenated cells were found to have longer telomeres: protective portions of DNA at the end of each chromosome that are naturally shortened each time a cell divides, due to unavoidable replication error. Think of them as the aglets of your DNA. Studies have shown that longer telomeres correspond to longer lifespans, not just for T cell populations but potentially for whole organisms.

Cancer Therapy Potential

This holds great potential as a means of fighting cancers. One could foresee a future where large quantities of CTLs are grown from your own cells, specific for a slew of antigenic determinants on the surface of the cancerous cells.

Clinical trials using other techniques have already shown that the induction of more potent CTLs can be an effective treatment for a variety of cancers, but often the body is so "used" to the cancer cells that its CTLs have become desensitized (sort of) to the presence of the cancer, and are less effective in combating the tumor. So I have a lot of hope for the future of fighting cancer. It's getting easier, folks.

Math and Biology meet in a technique to better classify cancer genomes

In cancer research and in oncology, it is enormously beneficial to sequence a cancer's genome to determine which cellular components are malfunctioning to cause cell proliferation. With a particular cancer's genome in hand, scientists and doctors are able to classify and better diagnose cancer types, as well as determine which anticancer drugs are going to be the most effective in treatment. The technology for this sort of genetic profiling is expensive, however, and many runs are needed to get accurate and specific results.

Scientists' ability to perform these analyses has previously relied on techniques whose mainstay is eliminating error through the sheer force of a lot of data. Sequencing an entire genome is inherently error prone, and so to counteract the propensity for false positives the sequence of any particular genome may be taken several times before a sufficient degree of accuracy is established. This process has, in the past, been time consuming and expensive. However, with the advent of high-throughput DNA sequencing, and with the publicly available techniques presented in "MuTect", the ease of cancer mutation screening will go up as the price steadily goes down.

MuTect is a technique developed by a group of researchers at The Broad Institute in Massachusetts. It uses a form of statistical analysis based on Bayesian probability that allows the researchers to more accurately characterize single-base mutations, called SNVs (single nucleotide variants), with fewer sequencing runs. The method is different from its predecessors in that it takes into account the data from each subsequent run and becomes more accurate as each run is finished. MuTect has been shown by the authors to be significantly more accurate overall, and at lower sequencing depths (roughly, the number of reads covering each position), than competing standard methods.
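To give a flavor of what a Bayesian comparison like this looks like (and to be clear, this is my own toy illustration, not MuTect's actual model; the error rate, allele fraction, and read counts are made-up numbers), here's a sketch that asks which explanation fits the reads at a site better, a real mutation or sequencing error, expressed as a log10 odds score:

# Toy version of the Bayesian idea behind somatic mutation callers: compare
# how well "real mutation at some allele fraction" vs. "variant reads are just
# sequencing errors" explains the data, as a log10 odds (LOD-style) score.
# NOT MuTect's actual model; all numbers here are illustrative.
from math import comb, log10

def binom_likelihood(k, n, p):
    # Probability of seeing k variant reads out of n if each read is variant
    # with probability p.
    return comb(n, k) * (p ** k) * ((1 - p) ** (n - k))

def lod_score(alt_reads, depth, error_rate=0.001, allele_fraction=0.5):
    # log10 of P(reads | real mutation) / P(reads | sequencing error only).
    likelihood_mutation = binom_likelihood(alt_reads, depth, allele_fraction)
    likelihood_error = binom_likelihood(alt_reads, depth, error_rate)
    return log10(likelihood_mutation / likelihood_error)

# 6 variant reads out of 30 strongly favor a real mutation (large positive score)...
print(lod_score(alt_reads=6, depth=30))
# ...while 1 variant read out of 30 is easily explained by error (negative score).
print(lod_score(alt_reads=1, depth=30))

The published method is of course far more sophisticated than this and layers additional filters on top, but weighing competing statistical explanations for the observed reads, and getting more confident as more data accumulates, is the general flavor.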

In addition, the authors present evidence that MuTect is better than competing methods at picking out mutations present at low frequency in a tumor sample. As cancer cells divide they may gain or lose mutations, and after time has passed they may carry a different profile of mutations than the original set of cancer cells. MuTect has opened the door to better tracking and studying these "subclones" as they grow, and potentially as they metastasize, in cancer patients.

MuTect comes out of The Broad Institute, a multidisciplinary institute started less than 10 years ago, growing out of efforts by members of both Harvard and MIT. The Broad Institute, according to its mission statement, seeks to "act nimbly, work boldly, share openly, and reach globally" in accelerating our understanding and treatment of disease. It was started by philanthropists Eli and Edythe Broad. For more information on MuTect see the Broad Institute's publication in Computational Biology, and for more information on the Broad Institute make sure to visit their website.

Note: though I wish it were so, the Broad Institute did not offer me anything to write this post; I just think they're neat.