Single-use website privacy in academia

It’s grad’ school admissions time again (yay!), which means I’m knee-deep in requests for letters of recommendation. Just as an example, one student has requested letters for 7 different institutions. Writing the actual letters is not a big deal (hey Nick, you owe me a beer!) but the websites some universities use for admissions are a royal P.I.T.A.

One such website (used by a mid-size college in that big metropolis near Cape Cod, MA) is “LiaisonCAS”. The premise seems rather innocent and simple at first – the University uses such a site to coordinate the upload of application materials, so nothing gets lost along the way, and presumably this saves a lot of collating work that would previously have been done by a grad’ school administrator. The problem is, there are hundreds of these sites, all commercially operated, and all requiring their own User ID / Password combination. This creates 2 problems…

Problem 1 – too many passwords

I’m a strong believer in never using the same password twice, hence my reliance on the open-source KeePass application, which stores passwords (typically random strings of 20+ characters) in an encrypted file. All told, I have 200+ unique login IDs. Just to emphasize the size of the problem here, I recently consolidated my logins for Elsevier journals into a single user ID, and it broke the portal set up to do this. It turns out I had reviewed for 37 separate Elsevier journals over the years, and some extensive ’phone tech support was required to give me a single account to manage them all.
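Just to illustrate what I mean by a random 20+ character password, here’s a minimal Python sketch of the idea (KeePass has its own built-in generator; this is purely illustrative):

```python
import secrets
import string

def random_password(length: int = 24) -> str:
    """Return a random string drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique, disposable password per site - never reused anywhere else.
print(random_password())
```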

So the problem here is: do I really want to register for yet another website, create a unique user ID, and have all my contact info’ out there in the cloud? Am I ever going to use this website again? Probably not, which means it’ll just sit there waiting to be hacked a decade from now, resulting in a bunch of spam, junk mail, and nuisance ’phone calls. Over the years, on top of the 200+ login UN/PW combos mentioned above, I would guess I’ve accumulated at least that number again in single-use website visits. This is not good for security. And no, it’s not as simple as me going to the trouble of making another 200+ unique UN/PW combos for all these sites – I don’t want a KeePass database filled with junk.

Problem 2 – Draconian privacy policies

Like most admissions portals, the one mentioned above requires that you agree to its privacy policy as a condition of signing up for the privilege of submitting materials. Most people breeze through these things when signing up for a site, which is no big deal if you’re going to use it regularly and it’s a necessity for your job. Apple’s iTunes EULA is notoriously squirrely about privacy issues, but if you wanna use an iPhone, deal with it.

However, in the case of a single-use site, these policies need a bit more scrutiny. What exactly are you agreeing to, for this one-time use? Here’s the section from LiaisonCAS’s policy on what they (the site owner, and the University by extension) can do with your information. There’s a bunch of guff about using your contact info’ for contests, surveys and promotions, which is worrying in itself. But then there’s this:

3.10. Other Uses. In addition to the uses specifically identified in this Section 3 (Our Uses of Your Personal Information), we may use Personal Information you submit in any other manner we reasonably deem necessary in order to provide you with the information, products and services you request from us….

Essentially, what they’re saying here is that they can do what the hell they like with your data, so long as they can write it off as “necessary” for the service you request. What you’re requesting, of course, is the privilege of uploading stuff for admissions. And the price you pay is them having the freedom to shill your data out to their spammy partners under the guise of necessity.

This is not cool.

So what to do instead? In this case, I found the email address of the Dean of the graduate school in question, and emailed them the letter directly. Sure, it took another 5 minutes, but at least all my personal contact info’ isn’t sitting out there on some nondescript company’s website waiting to be sold.

New Year’s Resolution for academics – fight the requirement to create a new User ID / Password for any website that you know you’ll probably never use again.


A cluster of chlorination… our latest paper in AJP Heart

Our paper on autophagy is finally out in AJP Heart. It’s essentially the story of a follow-up to a high-throughput screen we did several years ago, to find molecules that protect cells (and hearts) from ischemia-reperfusion injury.

One of the hits was the molecule cloxyquin, an 8-hydroxyquinoline with striking structural resemblance to another drug, clioquinol. The latter has been proposed to stimulate autophagy, and also causes mitochondrial uncoupling (it’s also known as “chinoform” in the older literature). Notably, Roberta Gottlieb’s group has shown that stimulation of autophagy occurs in ischemic preconditioning, and it’s also known that mito’ uncoupling stimulates autophagy.

So, we hypothesized that the protective effect of cloxyquin might also be due to mito’ uncoupling stimulating autophagy. First we showed that cloxyquin, like its cousin clioquinol, does indeed uncouple mito’s. It also stimulates autophagy in cardiomyocytes (for this we made cells from GFP-LC3 mice). We also showed that cloxyquin is protective in an in-vivo model of IR injury. Lastly, we showed that protection by cloxyquin could be blocked by the autophagy inhibitor chloroquine.

Cloxyquin, clioquinol, chloroquine, what a cluster!

An important thing we learned during this work is that combining an autophagy activator and inhibitor in-vivo is not a good idea. Apparently this is well known in the field, but it wasn’t to us… when you stimulate autophagy and inhibit it at the same time, things go downhill very fast (aka the mouse drops dead). We learned about this verbally at a conference, but I can’t find a reference in the literature for it. I guess nobody likes to report when experiments don’t go as planned.

Another key thing we learned during the review phase was that drugs can often be had from obscure suppliers for less money. Specifically, we had argued in early reviews that doing a complex series of experiments using Bafilomycin A1 (another autophagy inhibitor) would be prohibitively expensive, because the drug is $150 for 10 micrograms, and we would have required several thousand dollars’ worth for this experiment. A reviewer kindly alerted us to the same drug at $198 for 5 milligrams from LC Biosciences. Literally, 500 times more drug for about the same price! The experiment ended up not working out anyway, so it’s now lost to the ether (aka figures for reviewers only), but it’s always good when a reviewer offers to help out in this way with practical advice, rather than just crapping on your work.
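For anyone checking the math, here’s the back-of-envelope version (illustrative arithmetic only, using the two prices quoted above):

```python
# Bafilomycin A1 price comparison, using the two quotes above.
orig_cost, orig_mg = 150.0, 0.010   # $150 for 10 micrograms (0.010 mg)
alt_cost, alt_mg = 198.0, 5.0       # $198 for 5 milligrams

print(f"{alt_mg / orig_mg:.0f}x more drug")                              # 500x more drug
print(f"${orig_cost / orig_mg:,.0f}/mg vs ${alt_cost / alt_mg:.2f}/mg")  # $15,000/mg vs $39.60/mg
```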

Finally, does this get us any closer to a therapy for MI? Probably not, because the protection by cloxyquin is prophylactic – we’d need to guess who’s going to have a heart attack and load them up with the drug beforehand. Not easy from a clinical perspective. BUT – cloxyquin does appear to be an interesting tool compound. If you’re looking for something other than rapamycin to stimulate autophagy, give it a try (and no, we haven’t tried to see if it extends lifespan yet)!

Moving forwards, we’ve recently done another screen with a more relevant cell type and adding drugs at the moment of reperfusion (equivalent to delivery during percutaneous coronary intervention in the cardiac catheterization lab’).  This has yielded some very interesting hits which we’ll be reporting on soon.

Kudos here goes to Jimmy Zhang, the MD/PhD student in the lab who led this project (and who also, BTW, just passed his PhD qualifier exam last month). Congratulations, Jimmy!


Punching down: In defense of PubPeer

Today marked a new low in the depths to which the publishing industry will sink to maintain its outdated business model. I’m talking about a piece of literary detritus entitled “Vigilante Science”, published in Plant Physiology and penned by its editor-in-chief, Michael Blatt. It’s a classic punch-down piece from a journal editor having a hard time coming to terms with the death of the publishing industry as we know it.

Readers of this op-ed masquerading as serious discourse will rapidly learn that the popular post-publication peer review portal PubPeer (8 Ps – count ’em!) is a mere “by-product of the social media age”. Straight away, he starts out dissing it as a side-issue, as opposed to what it really is – a potential solution to a rampant problem foisted on academia by multi-billion-dollar publishing corporations (one of which he happens to work for).

Readers are then treated to (wait for it…) the two-fold problems with PubPeer…

Problem 1 is simply stated as “Most commenters take advantage of the anonymity afforded by the site in full knowledge that their posts will be available to the public at large”. There is no further attempt to explain why this is a problem; we’re just asked to assume that it is.

Problem 2 is stated bluntly as “The vast majority of comments that are posted focus on image data (gels, blots, and micrographs) that contribute to the development of scientific ideas but are not ideas in themselves”.

After stepping back from my monitor to avoid spitting coffee at it, I read that sentence again, and have read it several times since, and I still can’t begin to understand how someone purporting (pretending) to be a scientist could think in such a way.

To borrow another of Dr. Blatt’s phrases – “Let’s not mince words”.  What we’re talking about here is the editor of a major journal, claiming that problems with data are not a big deal, as long as the underlying ideas are OK.  Hopefully I don’t need to explain how dangerously at odds with the scientific method such a viewpoint is (hint – it’s all about the data; anyone can have ideas, but without data to support them, ideas are worthless).

Taken together, the two-fold problems (anonymity, and mostly petty issues with figures) pose a fundamental paradox: if PubPeer is really as unimportant as he claims it is, why bother with an editorial such as this?

The good Dr. Blatt’s justification for his tirade against commenters who have an eye for dodgy data is that “no journal club I ever organized or contributed to was so obsessed with the minutiae of data presentation”. My only response can be to suggest he find a better journal club!

The article moves on with a series of poorly informed opinions. Take for example this gem…

“While there is no danger of public embarrassment for the commenter, likewise there is no opportunity to gain from a personal exchange with the author.”

Does Dr. Blatt even consider the possibility that many commenters on PubPeer have already exhausted the polite channels he espouses?  Does he consider the fact that when confronted with problem data, many authors simply ignore, obfuscate, or go on the offensive (often hiring lawyers)?  The opportunity to interact with authors is presented as a panacea, failing to acknowledge the reality that such interaction is often fruitless.

He continues…

“What is the rationale? Given that the majority of comments show the most petty kind of scientific criticism, can there be any doubt that the intent often is to pillory”.

Does it even cross his mind that a possible motivator might be the integrity of the scientific record? Does he stop to think that commenters may be tired of trying to compete with cheats?  No, it’s far easier to write it all off as a bunch of disgruntled scientific competitors whose sole motivation is the satisfaction of tanking someone’s career!  To put it mildly – many of us have bigger fish to fry (i.e. our own science).

This little gem seeks to put pre-publication peer review (i.e., the bread-and-butter of his journal) on a pedestal, somehow better than post-pub’…

“I accept that there is a case for anonymity as part of the peer-review process. However, the argument for anonymity in postpublication discussion fallaciously equates such discussion with prepublication peer review.”

Given that many of the cases on PubPeer have found problems that should have been caught during pre-pub’ peer review, there is a strong argument to be made that post-pub’ review is not only equal to pre-pub’, but better! Considering that both exercises are conducted for free, but the only one making a profit on the back of free labor is the pre-pub’ variety, perhaps we should equate them by paying people for post-pub’ peer review?

It goes on… he sides with Hilda Bastian in espousing “the importance of assessing whether commenters are outside their areas of expertise”. This is a classic pratfall of the entitled. I’ll re-phrase it into plain English – your opinion only counts if I deem you important enough to have an opinion. Witness this discussion between me and a senior scientist on PubPeer, in which my scientific credentials were considered a topic worthy of discussion, instead of the actual data in question. Quite simply, there are no rules regarding who is qualified to comment on science. Anyone who claims otherwise is engaged in protectionism/racketeering.

He then adds this beauty…

“So, whatever the shortfalls of the peer-review process, I do not accept the argument that it is failing, that it is a threat to progress, or that, as scientists, we need to retake control of our profession. Indeed, if there is a threat to the scientific process, I would argue that, unchecked, the most serious is the brand of vigilante science currently facilitated by PubPeer.”

So let’s get this straight – the problems facing science today are not: (i) a lack of funding, (ii) rampant fakery, (iii) politicians seeking to defund things they don’t like, (iv) inadequate teaching of the scientific method in schools, (v) proliferation of the blood-sucking, profiteering publishing industry, (vi) an obsession with impact factor and other outdated metrics, (vii) a broken training-to-job pipeline in academia, (viii) insert your favorite #scipocalypse cause here.

No, the #1 threat to science right now is vigilantes on PubPeer! Of course… if only we could just get rid of the idiots on PubPeer, the most serious threat to science would disappear and we could all go back to living the high life!  Sign me up!

He then issues a dangerous call to action…

“I urge scientists publishing in Plant Physiology and other reputable scientific journals not to respond to comments or allegations on PubPeer”.

My only suggestion here would be to ignore this “advice” – unless, of course, you plan on a protracted series of legal challenges or online battles, resulting in an institutional investigation and your eventual firing.

The way that “real” scientists respond when their data are questioned is to answer the damn question! Show the data. Produce the originals. In case you hadn’t noticed, the front page of PubPeer cycles once every 3–4 days – if there’s an innocent explanation, you WILL be vindicated, and your career will not end if you engage with the commenters.

To cap it off, in the ultimate punch-down, Dr. Blatt accuses the founders of PubPeer of unmasking themselves solely for the purpose of making money. This neatly sidesteps the huge amount of personal resources they poured into the enterprise from the beginning, and of course ignores the fact that they will not gain anything personally from this new effort, because the organization is a 501(c)(3) non-profit foundation.

It takes an exquisite amount of hypocrisy to speak from the bully pulpit of an entitled publication – part of a multi-billion-dollar enterprise – punching down at a non-profit foundation and accusing it of being money-hungry. The only possible motivation I can think of for this op-ed is an editor and an industry witnessing the slow decentralization of their control over information (for massive profit), seeking to discredit an upstart grass-roots organization that might disrupt the status quo.

What a sad, sad display of the death throes of an empire.

Such events usually do not end in the emperor’s favor.


Metabolomics, peer review, and an ode to the Langendorff perfused heart

Finally, our metabolomics paper is in press at J. Mol. Cell. Cardiol. (email me if you want a reprint). TL;DR… SIRT1 drives most (~85%) of the metabolic alterations that occur in the heart during acute ischemic preconditioning (IPC).

This was quite a tough paper to get published. We started the project in spring 2013 and wrote it up in fall 2014. It got rejected from a big journal (IF>15) first, then went 2 rounds at a mid-level (IF>10) journal before being rejected again, and then 2 rounds at JMCC before acceptance. All told, a year of back-and-forth with reviewers and editors.

The model system we used to investigate this topic was the Langendorff perfused mouse heart, plus splitomicin, a pharmacologic inhibitor of SIRT1. The basic issue with the reviews that ended up as rejections was an insistence by reviewers that we do things in-vivo and using knockout mice.

Normally, we’re big fans of moving toward more physiologically relevant model systems, but in this case there are very specific reasons to use a perfused heart and a pharmacologic inhibitor. Here are some key points…

(1) Regarding pharmacology, the inhibitor we used is one we’d already shown can block acute IPC, so it’s a good candidate to test whether it also blocks the metabolic effects of IPC. Also, we had already shown that a SIRT1 KO mouse heart cannot be preconditioned, and that the endogenous protection seen in the SIRT1-overexpressing transgenic mouse can be blocked by a 5 min. infusion of the inhibitor. Thus, the time-frame for the effects of SIRT1 in IPC is very short – on the order of 20 min. The SIRT1 KO mouse has known long-term metabolic alterations, which would mask any changes we’d look for in IPC.

(2) Regarding in-vivo vs. in-vitro, it all boils down to sampling time. In our system, we can clamp the heart straight off the perfusion rig in liquid-nitrogen-cooled Wollenberger tongs. In effect, it goes from beating to frozen in less than a second. That’s important for getting reliable information on labile metabolites such as ATP, NADH, GSH and other redox-sensitive species.

The problem is, when you precondition a mouse heart in-vivo, it’s a focal ischemia model. Only part of the heart is ischemic (the bit downstream of the vessel you occlude), so if you try to dissect out the ischemic zone, you delay the clamping by a couple of minutes and destroy all the labile metabolites during the dissection. Alternatively, if you clamp the whole heart right out of the animal into liquid nitrogen you create 2 problems… First, all the changes in the ischemic area get “diluted” by the part of the heart that wasn’t ischemic (the so-called “area not at risk”). Second, you’re also sampling blood, so you don’t know if the changes you see are in the myocardial tissue or in the blood that comes along for the ride (by our estimate, when you clamp a heart out of a mouse, about 1/3 of the sample is blood). In contrast, the perfused heart system has no blood, so the whole sample is myocardium. Also, the entire heart is ischemic, so there’s no dilution. (The sketch after point 3 below puts some illustrative numbers on this.)

(3) The other major issue concerns the type of metabolomics analysis you want to perform. In this paper, we performed not only steady-state metabolomics (i.e., measuring the relative levels of metabolites), but also 13C-labeled substrate tracing. The latter can yield proxy information about metabolic flux, which steady-state measurements cannot. This is easy in the perfused system… just throw 13C-glucose or 13C-palmitate in the perfusion media. But in-vivo this creates problems: you can’t just deliver labeled substrate to a whole mouse and assume it’s only being metabolized by the heart on first pass. For example, the cardiac/liver Randle cycle can result in labeled glucose being turned into labeled fat by the liver, then sent to the heart as fuel. Also, whatever 13C-substrate you infuse is going to compete with endogenous blood-borne substrates in the animal. In the perfused system you can swap out the whole substrate (i.e., replace all the glucose with 13C-glucose), so you have much tighter control over delivery.
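To put some numbers on the dilution problem from point 2, here’s a minimal back-of-envelope sketch in Python. The risk-zone fraction and the fold change are hypothetical illustrative values (not data from the paper); only the ~1/3 blood estimate comes from the text above, and blood is assumed to sit at baseline metabolite levels:

```python
# Hypothetical illustration of why whole-heart clamping dilutes the ischemic signal.
risk_zone_fraction = 0.4   # assume ~40% of the heart lies downstream of the occlusion
blood_fraction = 1 / 3     # our estimate: ~1/3 of a clamped in-vivo sample is blood
true_fold_change = 2.0     # suppose a metabolite doubles in the ischemic zone

tissue_fraction = 1 - blood_fraction
# Measured signal = blood (assumed at baseline) + non-ischemic tissue + ischemic tissue
measured = (blood_fraction * 1.0
            + tissue_fraction * (1 - risk_zone_fraction) * 1.0
            + tissue_fraction * risk_zone_fraction * true_fold_change)

print(f"true change: {true_fold_change:.1f}-fold, measured: {measured:.2f}-fold")
# -> a 2-fold change shrinks to ~1.3-fold. In the perfused heart the whole
#    (blood-free) sample is ischemic myocardium, so measured == true.
```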

So, this really is one of those cases where Krogh’s principle comes into play: the in-vitro, pharmacology-based approach really was the best system available to answer the question at hand (namely, what fraction of the metabolic changes that occur in acute IPC are governed by SIRT1 signaling?).

Naturally, we argued all the above points, and it didn’t get us anywhere! For a lab that routinely uses both in-vivo and knockout models, it’s rather frustrating to be locked out of publishing in certain journals because we chose to use an allegedly inferior system. It’s annoying that some journals have a myopic focus on knockouts and in-vivo data, which precludes them from publishing otherwise solid work. Thankfully, JMCC seems to have a more sensible approach to this type of work!


Mito ROS Slides

Last week I had the honor of being a speaker at the “MiP” (mitochondrial physiology) school in Greenville, NC. The event is one of a long series organized by Erich Gnaiger (inventor of the Oroboros Oxygraph-2k respirometry apparatus). The meeting included a series of methods workshops and scientific talks from abstracts, as well as didactic lectures based on the framework of the book “Bioenergetics 4”, one of whose authors (David Nicholls) gave several lectures.

My lecture was on “Mitochondrial ROS generation”, a seemingly massive topic which cannot really be covered in any depth in 45 minutes.  But anyway, here are the slides (PDF), in case anyone might find them useful.