📈 #49 Latest news in Operations Research (🎃 Halloween edition)
More PDLP, AI infrastructure, and real-world OR applications

Borja Menéndez
Nov 03, 2024

I’ve told you several times that the Operations Research field is constantly evolving!

That’s true, although not at the speed of AI. AI is faster than light!

In the past few weeks, there’s been a lot of news. For example, the new solver from Google, PDLP.

Let’s start from there. Today’s Feasible post will be about:

  • 🧪 Is NVIDIA testing PDLP on GPUs?

  • 🕸️ INFORMS and the importance of graphs through Walmart’s vision.

  • 🌪️ Google Cloud’s AI infrastructure and its impact on OR companies.

I’ll also share why traditional solvers like Gurobi should adopt PDLP.

Let’s get started! 🪂

🧪 NVIDIA testing PDLP on their GPUs

You’ve read about PDLP in Feasible at least 3 times.

It’s the most important recent development in mathematical optimization solvers 🧩.

What a wonderful era to be alive!

If you haven’t read about PDLP, start here to understand its implications for solving large-scale LPs using GPUs. Then, continue here to see its performance against Gurobi. If you’re more interested, read the original blog post and paper from Google.
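If you want the intuition in code: PDLP builds on the primal-dual hybrid gradient (PDHG) method, whose core iteration needs nothing but matrix-vector products, exactly the operation GPUs are great at. Here’s a toy NumPy sketch of that bare iteration (the function name and the tiny LP are my own illustration, not Google’s code); the real PDLP adds presolve, diagonal preconditioning, adaptive step sizes, and restarts on top:

```python
import numpy as np

def pdhg_lp(A, b, c, iters=20_000):
    """Bare-bones PDHG for:  min c·x  s.t.  A x >= b,  x >= 0.

    Only the core iteration behind PDLP; the real solver adds presolve,
    preconditioning, adaptive step sizes, and restarts.
    """
    m, n = A.shape
    # Convergence needs tau * sigma * ||A||^2 < 1; use tau = sigma = step.
    step = 0.9 / np.linalg.norm(A, 2)
    x, y = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        # Primal step: move along the Lagrangian gradient, project onto x >= 0.
        x_new = np.maximum(x - step * (c - A.T @ y), 0.0)
        # Dual step on the extrapolated primal point, projected onto y >= 0.
        y = np.maximum(y + step * (b - A @ (2 * x_new - x)), 0.0)
        x = x_new
    return x

# Toy LP:  min x0 + 2*x1  s.t.  x0 + x1 >= 1  ->  optimum at x = (1, 0).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
print(pdhg_lp(A, b, c))  # ≈ [1. 0.]
```

No factorizations, no simplex pivots: just `A @ x` and `A.T @ y`. That’s why this family of methods scales to huge LPs on GPUs.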

You haven’t been isolated from the world lately, have you? 🌍

NVIDIA hasn’t.

They’re testing PDLP.

They integrated PDLP into their cuOpt library and found interesting things. Want to read it? Go here!

Almost a year ago, two University of Chicago researchers tested the potential of PDLP using GPUs through the Julia programming language.

Their main conclusion was awesome:

The numerical experiments show that the prototype GPU implementation cuPDLP.jl can match the performance of commercial solvers like Gurobi and outperform them on large instances. This reveals the potential of using GPU to develop high-performance optimization solvers.

Read the full paper here.

My ✌🏻 two cents. Should traditional solvers like Gurobi embed PDLP in their workflow?
