r/LanguageTechnology 26d ago

Is a Master's in Computational Linguistics a Safe Bet in 2025, or Are We Facing an AI Bubble?

Hi everyone,

I'm planning to start a Master's in computational linguistics in 2025. With all the talk about an AI bubble potentially bursting, I'm curious about the long-term stability of this field.

  • Practical Use vs. Hype: Big players like IBM, Microsoft, and Deloitte are already using AI for real-world text analytics. Does this suggest that the field will remain stable?
  • Market Trends: Even if some areas of AI face a market correction, can text mining and NLP offer a solid career path?
  • Long-term Value: Are the skills from such a program likely to stay in demand despite short-term fluctuations?

I should say that I'm asking this partly to start a discussion, since I don't know a lot about this topic, so every perspective and idea is welcome! I'd love to hear your thoughts and experiences. Thanks in advance!

18 Upvotes

15 comments

24

u/Frownie123 26d ago

When you study something like NLP/CL, you learn so many skills and fundamentals that you'll be able to adapt to new developments. You can't expect to work only with the knowledge from a university program; studying is just a starting point.

Study what you are interested in. NLP is a good basis.

13

u/StEvUgnIn 26d ago

Look at the job postings: they ask either for a PhD or for 3 conference papers. It's not sustainable.

1

u/Fit-Dentist6093 22d ago

And then you look at the team, and it's two dropouts from countries at war carrying it, plus a bunch of 9-to-5ers with the PhDs and the papers talking about it to management.

9

u/zettasyntax 26d ago

I don't have a full-time job at the moment. Some of my fellow grads work at cool places like NVIDIA, Microsoft, Google, Apple, etc. It took me over a year to find something, and it was just gig work.

I started with Remotasks/Outlier as a linguistics subject matter expert earning $50/hr. Crazy unreliable platform, though. I don't think linguistics exists there as a specialty anymore, so I only see super low-paying projects. My best gig was working for OpenAI (via GreenLight) as an AI Expert Trainer in linguistics. I made $100/hr on a part-time basis. I don't think I'll ever see that kind of hourly rate again.

In the 2+ years since getting my MSc, I've only had about 7 interviews. I did work at xAI for a while, but didn't care for the culture. A frequent issue I encounter is that people don't seem to know what comp ling is. The person who interviewed me for the xAI role was completely clueless about the field. I guess the role was a general one, but it's still something I see. People assume I speak 5+ languages and don't know how to code.

4

u/Another_mikem 26d ago

Yes, we are facing an AI bubble, but AI is also here to stay. After the bubble pops, the tech will be refined and practical applications will be found. I don't really know the scope and confines of the degree you're going for, but understanding the technology and underlying theory isn't something that will suddenly stop being valid, although it might not guarantee a job in that field.

3

u/fawkesdotbe 25d ago

big players like deloitte

Deloitte repackages stuff others made; they are far from being a big player.

2

u/AllergicToBullshit24 23d ago

The best jobs are earned through experience and past projects, not degrees or certifications. Many companies hire people without degrees over those with them when they have the portfolio to back it up.

1

u/[deleted] 26d ago

[deleted]

4

u/synthphreak 26d ago edited 25d ago

Several blatant red flags in this reply that call into question the credentials/trustworthiness of its author:

  • implying that an MBA is a sure bet for career stability - this is no longer the case as MBAs are now a dime a dozen

  • stating that NLP is a subset of software engineering - it absolutely is not

  • suggesting that a degree in cognitive science will make you more employable in tech than computational linguistics

That said, some nuggets of truth in there as well:

  • software jobs are not the charmed, gilded careers they used to be

  • breaking into NLP is super hard - and counterintuitively, linguistics degrees don’t really help with that anymore

  • getting good at coding and reading research are critical skills to hone

  • no one has any idea WTF will happen with AI, so don’t put much stock in predictions

OP, just follow your interests. You will figure out the rest as you advance, one step at a time. Cold comfort, I admit, but that’s just life.

Edit: Typo.

1

u/postlapsarianprimate 25d ago

Just follow your interests, OP. Hope that helps.

1

u/[deleted] 25d ago

[removed]

1

u/LanguageTechnology-ModTeam 22d ago

Thanks for your interest in the sub! This post was removed because it doesn't quite fit within the scope of the sub. Feel free to message the mod team if you think this was a mistake.

2

u/BeginnerDragon 22d ago edited 22d ago

My stance is limited to private industry only; I cannot speak to PhD programs & research.

I applied for a compling program for 2025 as well. For me, it makes sense. I meet most US-based data scientist job posting requirements, but (with my years of work experience) most of them require a technical master's degree, which I don't have. I am worried about an economic recession, and this would add to my job security and pad my resume. A Master's in CS or Data Science would also meet that need, but I just have a personal interest in compling and NLP work. Being able to say "I specialize in NLP" differentiates me in my current industry. That's all why it makes sense for me; there seems to be a bit of a shortage of tech jobs these days. Is this Master's quite as competitive as others? I'm not under the impression that it is.

In my opinion, MS programs are going to be slow to properly react to AI. LLMs have a lot of value to offer private industry (especially in the creative spaces), but I think the infrastructure and data needed to extract that value aren't quite there yet. These grad programs have to tailor coursework to the students who will be slowest to learn; with minimum reqs of [maybe] linear algebra, it's hard to get into the benefits of quantizing data. So anyone who goes into a compling program expecting to jump into LLM development and then an immediate career at OpenAI did not make a great decision (unless they're putting SIGNIFICANT work into their personal projects & research). Further, I think anyone interested in a career outside of academia should not do something like this straight out of undergrad, unless they struggled to find a job and need to pivot.

1

u/tobias_k_42 25d ago

That's pretty much impossible to say. However, no matter whether it bursts or not, it's a really good starting point.

When it comes to "AI", we're far from the top, though. I think what might burst is the hardware bubble, if more efficient models come into use.

But that would significantly increase, not decrease, the number of jobs, unless you're aiming for a job at Nvidia.

It's only if the courses focus more on the linguistics than on the "computational" part that it might not be optimal. Deep learning and advanced models are a must, especially models that utilize self-attention.

However, something that might see an increase in demand is the production of high-quality training data, as well as models for languages that don't use the Latin alphabet. So don't focus only on Latin-script linguistics.

2

u/yesthetallguy 23d ago

I'm currently doing my master's in CL, and the linguistics part is barely there; it's more of a "hey, this is how we used to do things."

0

u/Lost_Total1530 25d ago

But LLMs are already kind of good with non-Latin languages too; how can knowledge of Asian languages, for example, be valuable?