Optimization or optimisation? · Multi-Objective learning to rank · Minimum Cut
Local Optimum: short, imperfect-yet-useful ideas - Edition #17
Welcome to a new edition of Local Optimum: a short, imperfect-yet-useful collection of ideas related to optimization, decision-making, and applied Operations Research.
Let's dive in!
1) Optimization or optimisation?
As you may already know, I tend to use optimization.
In fact, I never use optimisation.
It's just that Team Z resonates more with me. But I've seen it written both ways, and some people are firmly in Team S, so I wanted to know how many of us are on the right side of OR history.
How do you spell it?
(click on the image to go to the poll)
2) Multi-Objective learning to rank
Amazon has recently released MO-LightGBM: A Library for Multi-objective Learning to Rank with LightGBM.
It's the first open-source framework that brings multi-objective optimization capabilities into LightGBM's gradient-boosted decision trees, enabling you to optimize ranking models against multiple criteria in one go.
Why would you care?
There are several applications here, but since I've been focused lately on the hybridization of Machine Learning and Operations Research, I'd highlight: i) warm-starting solvers by predicting good initial solutions (à la GNNs, as we saw the other day), ii) predicting good branching decisions for MIP solvers (if you're building one or creating your own heuristics), and iii) biasing neighborhood selection toward promising regions of the solution space (for metaheuristic guidance).
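To give a feel for the core idea, the simplest way to reconcile multiple objectives in gradient boosting is linear scalarization: blend each objective's gradient with a weight before fitting the next tree. Here is a minimal, self-contained sketch of that idea (this is not MO-LightGBM's actual API; the loss, targets, and weights are hypothetical, using squared error for both objectives):

```python
def blended_gradient(preds, relevance, revenue, w_rel=0.7, w_rev=0.3):
    """Weighted-sum scalarization of two objectives' gradients.

    Each objective here is a hypothetical squared-error loss, so its
    gradient w.r.t. the prediction is 2 * (pred - target). A booster
    would fit its next tree against this blended gradient.
    """
    grad_rel = [2.0 * (p - t) for p, t in zip(preds, relevance)]
    grad_rev = [2.0 * (p - t) for p, t in zip(preds, revenue)]
    return [w_rel * gr + w_rev * gv for gr, gv in zip(grad_rel, grad_rev)]

# Example: two items, scored against two sets of targets.
g = blended_gradient([0.5, 0.2], relevance=[1.0, 0.0], revenue=[0.8, 0.1])
print(g)  # gradients blended 70/30 toward the relevance objective
```

Weighted sums are only one scalarization scheme; part of the value of a dedicated library is supporting more principled trade-off strategies than a fixed weight vector.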
If you want to read the paper, you can find it here.
3) Minimum cut
I love it when big companies improve the way we solve problems or hand us new tools for doing so.
That's why, every time I see Google, Amazon, NVIDIA, and the like propose new methods, I actively promote them.
Some months ago, Google proposed a new algorithm to solve the Minimum Cut Problem with a 3-step process:
Cut-preserving graph sparsification
Connect the Minimum Cut Problem with low-conductance cuts
Partition the graph into well-connected clusters
Research on this problem spans more than seven decades, and only now do we have a highly efficient, general-purpose algorithm for large-scale instances.
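For context on what those decades of research produced, the classical baseline here is the Stoer–Wagner algorithm: repeatedly grow a set by maximum connectivity, record the "cut of the phase" that isolates the last vertex added, and merge the last two vertices. A minimal pure-Python sketch (this is the textbook method, not Google's new algorithm):

```python
def global_min_cut(graph):
    """Stoer-Wagner global minimum cut of an undirected weighted graph.

    graph: dict mapping node -> {neighbor: weight}, weights symmetric.
    Returns the total weight of the minimum cut.
    """
    # Work on a copy so we can merge vertices destructively.
    w = {u: dict(nbrs) for u, nbrs in graph.items()}
    best = float("inf")
    while len(w) > 1:
        # Minimum cut phase: grow a set from an arbitrary start vertex,
        # always adding the vertex most tightly connected to the set.
        nodes = list(w)
        start = nodes[0]
        key = {v: w[start].get(v, 0) for v in nodes if v != start}
        order = [start]
        while key:
            u = max(key, key=key.get)  # most tightly connected vertex
            order.append(u)
            del key[u]
            for v, wt in w[u].items():
                if v in key:
                    key[v] += wt
        t, s = order[-1], order[-2]
        # Cut of the phase: everything incident to t (i.e. {t} vs rest).
        best = min(best, sum(w[t].values()))
        # Merge t into s, summing parallel edge weights.
        for v, wt in list(w[t].items()):
            if v != s:
                w[s][v] = w[s].get(v, 0) + wt
                w[v][s] = w[v].get(s, 0) + wt
            w[v].pop(t, None)
        del w[t]
    return best

# Triangle with edge weights ab=1, bc=2, ac=3: cheapest cut isolates b.
tri = {"a": {"b": 1, "c": 3}, "b": {"a": 1, "c": 2}, "c": {"a": 3, "b": 2}}
print(global_min_cut(tri))  # -> 3
```

This dict-based version runs in roughly O(V³) time, which is exactly why near-linear-time results on large-scale graphs are such a big deal.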
Read more here:
Will adding more hardware improve your solutions?
I'm continuing the Where did the time go? series of posts next Monday.
Remember, this series covers different aspects of tractability issues in optimization problems.
This time, we'll cover:
Why we've been stuck with CPUs
What faster hardware is on the shelf?
Extra computational power: what can we still do?
If you want to understand how modern hardware fits into your optimization problems, this will be useful. See you Monday!
And that's it for today!
If you're finding this newsletter valuable, consider doing any of these:
1) Subscribe to the full version: if you aren't already, consider becoming a paid subscriber. You'll get access to the full archive, a private chat group, and 30% off new products.
2) Collaborate with Feasible. I'm always looking for great products and services that I can recommend to subscribers. Also, if you want to write an article with me, I'm open to that! If you're interested in reaching an audience of Operations Research engineers, this is the place to do it.
3) Share the newsletter with a friend and earn rewards in return. You're just one referral away from getting The Modern OR Engineer Playbook: Mindset, methods, and metrics to deliver Optimization that matters.
If you have any comments or feedback, just respond to this email!
Have a nice day ahead!
Borja.