The third episode of The Tortured Proteins Department is out now!
We chatted about grant cancellations, exciting regional meetings and reunions, two fun new preprints, community norms around code release, and the importance of giving kudos.
The preprints discussed in this episode:
Former lab member Stephanie Wankowicz and I have started a podcast called the Tortured Proteins Department. The first episode is out now!
We discuss the declining support of science in the US and how it may impact the future of graduate science education.
Like many others, I was disappointed by the lack of code, or even executables, accompanying the publication of AlphaFold3 in Nature. This made it impossible to test the most exciting claim of the paper: impressive performance in predicting the structures of proteins bound to novel ligands. I was even more upset to learn that my colleague Roland Dunbrack was “ghosted” after he submitted his initial review.
We organized an open letter to Nature questioning why a journal would fail to enforce its written policies. By failing to enforce them, Nature implies that it applies those policies inequitably, to the detriment of the overall scientific community.
In response to this letter, DeepMind announced it would release the code within six months. This was a reversal of its previous position (quoted in a Nature News article):
“‘We have to strike a balance between making sure that this is accessible and has the impact in the scientific community as well as not compromising Isomorphic’s ability to pursue commercial drug discovery,’ says Pushmeet Kohli, DeepMind’s head of AI science and a study co-author.”
Now Nature has replied in an unsigned editorial, saying it wants to be “in conversation” with our community around ensuring openness of the research ecosystem. This editorial focuses a lot on code disclosure. Journals want to play an important role in the research ecosystem going forward and have established that they will perform some valuable services:
- coordinating peer review
- performing ethics checks
- ensuring data and code are properly deposited
The editorial makes it seem as if this matter is only about a small exception regarding the code, and writes that the release of the code after a six-month delay (spurred by the open letter and community outcry, not by Nature) is:
“… an important step, and Nature will update the published paper once the code is released.”
Yet we still cannot validate the most fundamental claims about protein-ligand predictions. I find Nature’s description of the server disingenuous:
“The basics of how the community can use the new version of AlphaFold remain the same: anyone with a Google account can use the tool for free, for non-commercial applications.”
and
“In addition to the non-availability of the full code, there are other restrictions on the use of the tool — for example, in drug development. There are also daily limits on the numbers of predictions that individual researchers can perform.”
The server is restricted to 20 natural metabolites and ions. We still cannot even reproduce the figures of the paper.
Obviously, many companies want the Nature “stamp” of approval - this editorial shows, nakedly, that this “stamp” is a toxic part of our current research ecosystem, one that bends easily to corporate interests and applies inequitable standards. The canard that the private sector won’t publish unless journals let companies play by different rules is particularly problematic. Nature broke its peer-review process here (see Roland Dunbrack’s experience), and with a little community pressure, the authors changed course and promised an eventual code/executable release.
What’s the solution going forward? We can raise the bar! Academics should push the envelope in data and code disclosure alongside preprints with open review. Companies can also lead by example (see Arcadia Science, Pat Walters, and others) by doing a first-class job of disclosing data outside journals.
I’m optimistic about the scientific ideas presented in the AF3 paper. It’s an exciting time for AI and biosciences. Let’s make the future get here faster by building on each other’s work!
This is an opinionated guide for how to set up IT infrastructure for a new lab.
It assumes that you have at least some computing background, though it should be possible to follow along without one if you do a bit of research whenever you encounter something you don’t understand.
Web domain
- Create an AWS account with your personal (i.e., not .edu) email address. Tie your personal credit card to the account so that it’s clear that you’re the one paying and that it’s not owned by the university. (Domain registration for .com and .org is under $15 per year)
- After logging in, navigate to “Route 53” (the name of the AWS domain registration interface)
- Register a domain name. I recommend sticking to either a .com or a .org domain. (.edu registration is restricted to accredited schools)
- Aim for something short, memorable, and lacking weird characters (preferably only the 26 letters)
- Avoid having your university name in case you transfer to a different one sometime in the future
- Choose carefully, since it’s a giant pain to change domains once you start using them
I strongly recommend going with AWS over other providers like GoDaddy or Namecheap, since it’s a multi-billion-dollar business that won’t be going anywhere for decades.
Moving domains between providers is possible, but annoying.
AWS also has a reasonable API in case you want to do more advanced things in the future, like programmatically updating entries.
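As a sketch of what that programmatic access looks like, the commands below build a Route 53 change batch and show the AWS CLI call that would apply it. The hosted-zone id, record name, and IP address are placeholders, and the final command is commented out because it requires configured AWS credentials:

```shell
# Describe the change as a JSON change batch (UPSERT = create or update).
# Z123EXAMPLE, www.example.org, and 203.0.113.10 are placeholder values.
cat > change-batch.json <<'EOF'
{
  "Changes": [{
    "Action": "UPSERT",
    "ResourceRecordSet": {
      "Name": "www.example.org",
      "Type": "A",
      "TTL": 300,
      "ResourceRecords": [{"Value": "203.0.113.10"}]
    }
  }]
}
EOF

# Apply it (needs AWS credentials and your real hosted zone id):
# aws route53 change-resource-record-sets \
#   --hosted-zone-id Z123EXAMPLE --change-batch file://change-batch.json
```

The same change batch structure works from any language with an AWS SDK, so a cron job or deploy script can keep records up to date.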
Website
- Create a personal Github account if you don’t already have one. Github usernames and organization names occupy the same namespace, so you’ll need two different names (I’ll call them “example” and “examplelab” in the examples below)
- Create a free organization for your lab
- Create a repository for the examplelab organization (not for your user account) called “examplelab.github.io”
- Follow these instructions to create DNS records in Route 53 so that example.org loads the Github page automatically
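For an apex domain, those instructions amount to A records pointing at Github Pages plus a CNAME for www. A hedged sketch of the resulting records, in zone-file style (the four IPs are Github’s published Pages addresses at the time of writing; verify against Github’s documentation before copying):

```
; example.org / examplelab are placeholders from the examples above
example.org.      A      185.199.108.153
example.org.      A      185.199.109.153
example.org.      A      185.199.110.153
example.org.      A      185.199.111.153
www.example.org.  CNAME  examplelab.github.io.
```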
I recently set up a new website for the Manglik lab here, which is generated by this Github repository.
There is a single source of truth for lab members, _data/authors.yml, which is used both to generate the members page and to set blog post authorship.
To avoid repetition, the publication list is generated from blog posts whose front matter contains publication: true. This makes it possible to have both standard blog posts and posts that announce publications, with the latter simultaneously creating an entry on the publications list.
See here for an example.
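A publication-announcing post is just a normal post with one extra front matter flag. A minimal sketch (field names other than publication: true are illustrative; check the template’s existing posts for the exact schema):

```yaml
---
title: "Our new preprint is out"
author: example        # must match a key in _data/authors.yml
publication: true      # also adds this post to the publications list
---
```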
I recommend starting with the Manglik lab website as a template and editing the contents as appropriate, since it’s much cleaner than the repository that generates the Fraser lab website. You’ll only need to edit the following:
- _data/authors.yml to include your lab members
- _pages/about.md to include your contact info
- _pages/members.md to edit the “Joining” section
- _pages/publications.md to edit the Pubmed link to your own name
- the contents of research_/ to set your research interests
- the contents of assets/images/ (but not assets/css/)
- CNAME to match the URL of your website
- the paths in README.md
- the top few entries in _config.yml to set the site name, PI info, and site description
- the contents of _posts/, which generates both the blog posts and publications list as described above
Theoretically, Github limits these pages to 1 GB (a limit you hit surprisingly quickly once you start adding article PDFs or high-res images), but I don’t think they enforce it.
Ideally, you’d host anything over a couple of MB separately, but that’s kind of a pain.
Generally, Github and large files don’t play nicely, since Git maintains an append-only history: deleted files still count, so the repository keeps growing as you add and remove files.
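You can see the append-only behavior directly. In this sketch, a ~5 MB file is committed and then deleted, yet the .git directory still holds roughly 5 MB of history:

```shell
# Work in a throwaway directory
cd "$(mktemp -d)"
git init -q .

# Create and commit an incompressible ~5 MB file
head -c 5000000 /dev/urandom > big.bin
git add big.bin
git -c user.email=demo@example.org -c user.name=demo commit -qm "add big file"

# Delete it and commit the deletion
git rm -q big.bin
git -c user.email=demo@example.org -c user.name=demo commit -qm "remove big file"

# The blob is gone from the working tree but still lives in .git
git count-objects -v | grep size
```

Every clone of the repository carries that dead weight, which is why pruning large files before they’re ever committed is much easier than cleaning them up afterwards.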
Email
Consider registering for an email service on your domain so that you’re not tied to your university’s email infrastructure.
It will be hard to find one that costs less than $5 per user per month, which adds up quickly.
Most email services don’t support archiving accounts, so you’ll be paying that amount forever unless you’re okay with deleting everything.
I personally like Fastmail, which has a nice, snappy user interface, excellent support, and a free 30 day trial.
Its family plan supports up to 6 users for a flat $11 per month if paid yearly (discounted if you subscribe for longer).
They also pro-rate unused subscriptions, so when you exceed 6 users you can transition to a business plan that scales to an unlimited number of users at $5 per user per month without wasting money.
Topicbox
Topicbox is an email service built around inboxes designed to be shared.
It’s $15 per month for up to 50 users and any number of virtual addresses, and there’s a three-month free trial.
I recommend this since you can create one virtual inbox per vendor or group of people.
For example:
- labmanager@example.org, so that you have an address that doesn’t change when your lab managers change
- dms@example.org for all the people working on DMS in the lab
- ni@example.org for all your LabVIEW licenses, so that you don’t have to email whoever originally registered for the account after they’ve left the lab
- thermo@example.org for all your Thermo-Fisher warranty info
There’s a web interface that shows all the emails received for each virtual address.
Additionally, you can set it up so that users can subscribe to any subset of the various virtual inboxes and automatically receive a copy of any email received by those addresses.
You’ll need to follow the directions here to use your own domain instead of a topicbox.com domain.
You can’t easily host both the user emails mentioned above and the shared virtual addresses on the same domain, due to limitations in how email routing works, so I suggest hosting Topicbox on a subdomain (e.g., box.example.org) while your primary email lives on the main domain.
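The resulting DNS setup looks roughly like this, in zone-file style. The mail-server hostnames below are placeholders; use the exact MX values each provider’s setup page gives you:

```
; Main domain mail goes to your email provider,
; subdomain mail goes to Topicbox
example.org.      MX  10 mx.mailprovider.example.
box.example.org.  MX  10 mx.topicbox.example.
```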
Lab wiki
Lab wikis are great for storing general lab info, like an onboarding guide.
I really like Wiki.js, since you can set it up to sync with git; this lets you update the wiki the same way you update your website, in addition to using the built-in editor.
The wiki files are all plain text, which means the wiki is reasonably browsable through Github (in case Wiki.js ever stops being developed) and easy to port to a different wiki engine (e.g., DokuWiki, or MediaWiki, which powers Wikipedia) if you ever want to.
You’ll have to host it yourself, which can be done either on DigitalOcean, using an image pre-configured by the Wiki.js developers (following this official guide), or on AWS (following this community guide).
Right now, I’m working on setting up a pipeline that takes a recording of a meeting, converts it to text using a speech-to-text model, labels each sentence by speaker, summarizes it into a couple of paragraphs using the open-source Mixtral 8x7B, and then uploads it to the fully searchable wiki by creating a git commit that’s pushed to Github and synced to the wiki, all without human intervention.
If you’re a member of the Fraser lab, you can check this out for an example.
Instant messaging
Uhh… Welcome to the land of only bad options, approximately ordered from least bad to terrible:
- Element:
- Pros: open source, based on standards, can be hosted on a custom domain, has quality mobile and desktop versions, and smooth collaboration with users on other servers (if you can find any outside of Wikipedia and open-source projects)
- Cons: you’ll have to host it yourself (which means that you can’t send or receive messages if something breaks) or pay someone ~$5 per user per month to host it for you
- Signal:
- Pros: popular, open source, and free (and likely to stay free since it’s supported by a non-profit foundation)
- Cons: designed mostly for phones (though there is a desktop app that can be linked to your phone’s account) and lacks separation between personal use and business use
- Discord:
- Pros: free (at least for now), and has quality mobile and desktop versions
- Cons: designed more for gaming and voice chat than business, and likely not to stay free forever
- Slack:
- Pros: popular with a refined UI
- Cons: nearly $10 per user per month if you want access to messages older than 90 days, and cross-workspace collaboration is cumbersome
- Teams: absolutely not