I am currently studying information transmission in biological neural networks at a (fairly prestigious) university — specifically, the interaction between neurons and their surrounding glial cells. Most of my research involves modeling and simulating these cells as dynamical systems. The hope is that, through a better theoretical understanding of the brain, this line of research will one day allow us to build biocomputers that far outperform any software models existing today. There remain many things the brain can do that machines cannot, e.g., consciousness, selective memory, imagination.
However, in recent years I have become more and more disillusioned with my purpose in doing graduate research, and my disillusionment is shared by several of my colleagues.
This disillusionment is driven specifically by the rapid commercial success industry has enjoyed thanks to exponentially increased computing power and data (along with many other factors) — for example, wearable computing, AR for gaming, speech and image recognition, and personal robots. There are four "trends" that continuously distract me from being fully committed to my research.
It seems that most of the groundbreaking research is performed at research labs of large tech companies such as Google, Baidu, and Facebook, or at major hospitals, instead of at universities. These companies have the money, the resources, the public relations, the infrastructure, and the concentration of talent to produce truly innovative products, and their products tend to have greater and more immediate impacts on people's lives. It doesn't help that even traditionally academic journals now publish these success stories on a regular basis.
People in industry are slowly but surely enjoying greater and more immediate name recognition than people in academia. In fact, industry researchers seem to be regarded as more "successful" than their strictly academic counterparts (especially those without a prestigious title like "Company-X Professor"). Many companies proudly display their lists of researchers (on well-maintained websites, no less!).
Sure, fame should not be one's motivation for producing good science. But we all wish our research results were more widely recognized.
The ladder from academia to industry is high and getting higher. Many industry jobs (even in simulation work) require years of industrial experience, primarily in coding and writing software. These skills are not emphasized in most research departments outside computer science. Knowing how to do parallel computing in C++ is not an essential skill for creating clear and well-reasoned simulations or supporting graphics, nor does it advance the state of research and, ultimately, human knowledge. Therefore, even a graduate with a PhD will be at a significant disadvantage compared with someone who started out in industry, and this trend is not abating. In sum, the transferable skills gained through industry experience far outnumber those that can be acquired in academia.
Money (coupled with interesting research opportunities). For example, according to some self-reported salaries, machine learning jobs pay 100k–200k, and even 600k at the high end. While money has never been a strong motivator for me, it seems that with this kind of funding, any type of research could be greatly accelerated — without the stress of applying for grants and seeking scholarships that pay a quarter of what you could earn at a company that potentially does its own interesting research.
Note that this is not an argument for quitting research and joining industry on the basis of its success, but rather for aligning one's skill set or research with industrial/commercial purposes instead of performing the type of research for which there is no industrial demand. Simulating biological systems and analyzing the related dynamical models is not nearly as much in demand as software engineering or machine learning, and I am not sure this trend will ever change.
I have heard many counter-arguments to these points. For example, research in the industry tends to be product driven and ultimately contributes nothing to advancing knowledge.
Or that much of the commercial success enjoyed recently is the result of engineering without the science behind it.
Or that we are at the peak of a hype cycle and things will eventually die down, since for every success story there are hundreds of failures.
Or, as my supervisor believes: good research is independent of where it is produced.
But I find it difficult to ignore these increasingly unfair trends, and I cannot help but feel that academics are being devalued. Will the research we perform at universities ever be recognized (in our lifetime)? Or have an impact comparable to what industry achieves on a regular basis? Could my talents be put to better use — am I wasting my time? Should I change my research direction to pursue skills that are more valuable in industry? Should I quit graduate school and work in industry full-time?
These questions distract me from focusing on my research, especially its more theoretical aspects, and I find myself constantly making choices that deviate from my research goals, such as taking classes on the techniques underlying these commercial products. My copy of Winfree's "The Geometry of Biological Time" is slowly gathering dust, while a copy of "How to Program in Python" sits by my bedside. How can I maintain a good attitude toward doing research while watching the rapid success enjoyed by large, well-funded companies?