Science has recently begun to assemble some of the tools that might let us develop a form of synthetic life. Building cells from scratch should teach us far more about what actually constitutes a living organism, while making it possible to generate simpler (yet no less sophisticated) life-like organisms that can be manipulated more predictably.1
Cryogenics is one of the most important fields to have been integrated into biomedical research. It's employed to store a variety of samples, including human tissue specimens, blood samples, and primary cells, making cryogenic storage an essential tool for hospitals and research facilities alike. Here, we'll briefly explore how the field of cryogenics has developed over the last century to produce the storage equipment now used throughout the world to support ground-breaking research and new medical advances.
Climate change is a global phenomenon with wide-ranging and potentially disastrous effects for the entire human population. The consumption of fossil fuels (e.g., coal, oil, and gas), combined with mass deforestation, has pushed atmospheric CO2 to levels higher than at any point in the past 800,000 years. These high CO2 levels have driven a significant increase in the average global temperature, a key factor behind the accelerated melting of the polar ice caps, warmer seas, and rising sea levels.1 Heat waves are much stronger than they used to be, record-breaking hurricanes occur far more frequently than before, and wildlife populations worldwide have declined by nearly 60%.2 It's well documented that these changes are the result of human activities, as worldwide economic and technological progress has led to a consistent increase in the amount of CO2 in the atmosphere. Altogether, this has produced a rise in the average global temperature of nearly one degree Celsius since 1901, with the rate of global warming having doubled since 1975.3
Many industries require barcodes to track their inventory, samples, and equipment. To integrate the data from the barcodes into a tracking system, the barcodes must be scanned when each item is processed. So, how do scanners relay the information from barcodes to a computer?
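Part of the answer is that the scanner's decoder not only reads the bar pattern but also verifies the digits it decoded before passing them along. As a concrete illustration of that arithmetic, here's a minimal Python sketch of the standard UPC-A check-digit calculation (the function name is ours, chosen for illustration): odd-position digits are weighted three-fold, even-position digits singly, and the check digit brings the total to a multiple of 10.

```python
def upc_a_check_digit(payload: str) -> int:
    """Compute the UPC-A check digit for an 11-digit payload.

    Odd-position digits (1st, 3rd, ...) are weighted 3x and
    even-position digits 1x; the check digit is whatever brings
    the weighted sum up to a multiple of 10.
    """
    if len(payload) != 11 or not payload.isdigit():
        raise ValueError("expected an 11-digit numeric payload")
    odd = sum(int(d) for d in payload[0::2])   # positions 1, 3, 5, ...
    even = sum(int(d) for d in payload[1::2])  # positions 2, 4, 6, ...
    return (10 - (odd * 3 + even) % 10) % 10

# The payload 03600029145 yields check digit 2, so the full
# barcode printed on the item reads 036000291452.
print(upc_a_check_digit("03600029145"))  # -> 2
```

If the check digit computed from the decoded bars doesn't match the one printed in the barcode, the scanner rejects the read rather than relaying a corrupted ID to the computer.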
Histology has evolved considerably since its beginnings in the 17th century, with advances in both specimen processing and analysis. Consequently, histology departments now face ever-larger workloads. To adapt, they have integrated automated systems, which save time and free histology professionals for other skill-based tasks, while maintaining enough flexibility to process and stain specimens according to the needs of the medical or research lab. Here, we'll explore how automation has been integrated into histology to speed up the workflow of both medical technicians and researchers.
Whether you have banks of cell lines stored in liquid nitrogen or assay reagents that are constantly being consumed, managing your inventory is essential to keeping your lab running smoothly. That means having processes and workflows in place to ensure the lab is working at peak efficiency, as well as having the proper materials and infrastructure to track and manage your assets. Below, we'll discuss some of the ways you can efficiently manage your inventory and keep track of everything in your lab.
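To make "track and manage your assets" concrete, here's a minimal Python sketch of the kind of record a simple tracking system might keep for each item, along with a reorder check. The field names and reorder rule are illustrative assumptions, not the schema of any particular inventory product or LIMS.

```python
from dataclasses import dataclass

@dataclass
class InventoryItem:
    """One tracked asset: a reagent lot, a cryobox, a cell-line vial."""
    name: str
    barcode: str        # ID printed on the item's label
    location: str       # e.g. "LN2 tank 1, rack 2, box 3"
    quantity: int
    reorder_level: int  # restock when quantity falls to this level

def items_to_reorder(inventory: list[InventoryItem]) -> list[InventoryItem]:
    """Flag anything at or below its reorder threshold."""
    return [item for item in inventory if item.quantity <= item.reorder_level]

stock = [
    InventoryItem("HeLa vials", "LT-0001", "LN2 tank 1, rack 2, box 3", 4, 5),
    InventoryItem("Taq polymerase", "LT-0002", "Freezer B, shelf 1", 12, 3),
]
for item in items_to_reorder(stock):
    print(f"Reorder {item.name} ({item.barcode}) at {item.location}")
```

Even a lightweight record like this, keyed to the barcode on each label, is enough to answer the two questions inventory management keeps coming back to: where is it, and how much is left?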
Errors occur every day in healthcare institutions and research facilities. Medical lab errors can be very costly, setting hospitals back hundreds (sometimes thousands) of dollars for every mislabeled sample and, far worse, causing irreparable harm to patients' physical and mental health. Errors in research have a broad impact as well, skewing results and wasting years of effort along with precious, often irreplaceable, materials.
So, you've decided to purchase a set of labels and a label printer, but you haven't yet chosen the software you'll use to design your labels. There are several options, from the basic software that comes with the printer to specialized packages such as BarTender™ and Label Matrix™, each of which can be used on its own or as part of a laboratory information management system (LIMS). Here, we'll review some of the pros and cons of each option.
Artificial intelligence (AI) has been a popular topic ever since the term was introduced in 1956 by John McCarthy. It quickly captured the imagination of Hollywood, serving as a plot device for many blockbuster movies, including the Terminator franchise. Until recently, however, AI remained largely science fiction; only now have computers become powerful enough to put AI to appreciably practical use, allowing some of the top companies in the world, such as Google, IBM, and Apple, to design systems that learn on their own. Gartner, a research and advisory company that publishes a yearly list of the most hyped technologies (termed the Gartner Hype Cycle), has placed AI-associated technologies at the top of its list.1 With companies like PathAI, Freenome, and Benevolent AI all entering the market, it hasn't taken long for scientists to adapt AI to solving complex biological and medical problems as well.
CRISPR/Cas9 is a gene-editing tool that can cut and modify virtually any genomic sequence, either in vitro or in vivo. The underlying system, built around clustered regularly interspaced short palindromic repeats (CRISPR), was first observed in 1987 by a team of Japanese scientists and was adapted into a programmable editing tool by Jennifer Doudna and Emmanuelle Charpentier in 2012; in nature, bacteria use it to recognize foreign DNA and fight off viral infection. The tool has garnered a lot of attention recently as researchers have tailored CRISPR/Cas9 to edit animal genomes in ways that were previously impossible or inefficient, revolutionizing genetic and biomedical research. CRISPR/Cas9 has become a crucial resource for labs that require stable cell lines or mice carrying knockouts, knock-ins, or gene mutations, and it can also drive constitutive gene activation or edit micro-RNAs and long non-coding RNAs.
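As a taste of what "recognizing" a genomic sequence involves, here's a minimal Python sketch that scans the forward strand of a DNA string for candidate S. pyogenes Cas9 target sites, i.e., a 20-nucleotide protospacer immediately followed by an NGG PAM. The function name and example sequence are ours for illustration; a real guide-design tool would also search the reverse strand and score off-target risk.

```python
import re

def find_cas9_targets(dna: str) -> list[tuple[int, str, str]]:
    """Scan the forward strand for 20-nt protospacers followed by an
    NGG PAM, the motif S. pyogenes Cas9 requires next to its cut site.
    Returns (position, protospacer, PAM) tuples."""
    dna = dna.upper()
    targets = []
    # A zero-width lookahead lets matches overlap: at every offset we
    # capture 20 nt plus the 3-nt PAM that follows.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna):
        targets.append((m.start(), m.group(1), m.group(2)))
    return targets

seq = "TTGACGCATGCTAGCTAGCTAGCTAGCTAAGGTCGAT"
for pos, protospacer, pam in find_cas9_targets(seq):
    print(f"{pos:>3}  {protospacer}  PAM={pam}")
```

In the lab, the 20-nt protospacer becomes the guide RNA's targeting sequence, which is what lets researchers point Cas9 at one locus among billions of base pairs.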