Lecture 20 - Manufacturing

Last time I rapidly sketched some applications of resource theories to chemistry and scheduling. If you're interested, we can dig into these more deeply once we've learned more of the math in Chapter 2. For now, let's look at one more application:

Manufacturing. Companies often have to solve puzzles like this, though usually much harder:

Puzzle 58. You are making two kinds of computers: laptops and desktops. Each kind of computer needs 1 processing chip. A laptop also needs 1 memory chip, while a desktop needs 2 memory chips. You have 10,000 processing chips and 15,000 memory chips. Each laptop you make will give $750 of profit, while each desktop will give $1000 of profit. What is the most profit you can make?

Give it a try!

Of course this problem is simplified in many ways. We can make it a tiny bit more realistic by recognizing that it also takes time to make computers:

Puzzle 59. Everything in Puzzle 58 still holds, but now your workers have only 25,000 minutes to make all the computers. It takes 4 minutes to make each laptop, and 3 minutes to make each desktop. Now what is the most profit you can make?

These puzzles secretly involve resource theories. The resources you're starting with are:

  1. processing chips
  2. memory chips
  3. time

You are trying to transform these into some other resources:

  1. laptops
  2. desktops

and then transform these into yet another resource:

  1. profit

So, we can write some "reactions", as we did last time for chemistry:

$$ \textrm{[processing chip]} + \textrm{[memory chip]} + 4 \textrm{[minute]} \to \textrm{[laptop]} $$

$$ \textrm{[processing chip]} + 2 \textrm{[memory chip]} + 3 \textrm{[minute]} \to \textrm{[desktop]} $$

$$ \textrm{[laptop]} \to 750\textrm{[profit]} $$

$$ \textrm{[desktop]} \to 1000 \textrm{[profit]} $$

Then we can ask what profits are reachable if you start with

$$ 10000 \textrm{[processing chip]} + 15000 \textrm{[memory chip]} + 25000 \textrm{[minute]} .$$

This shows that the second puzzle is a special case of the "reachability problem" that I described in Lecture 19. So is the first puzzle.
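To make the reachability viewpoint concrete, here's a tiny Python sketch of my own (the encoding and names are illustrative, not a standard library for resource theories): a bundle of resources is a multiset, and a reaction consumes its inputs and produces its outputs.

```python
from collections import Counter

# A resource bundle is a Counter mapping resource names to quantities.
# A "reaction" consumes its inputs and produces its outputs, just like
# the chemistry-style reactions above.

def react(bundle, inputs, outputs, times=1):
    """Apply a reaction `times` times; return None if resources run out."""
    result = Counter(bundle)
    for resource, amount in inputs.items():
        result[resource] -= amount * times
        if result[resource] < 0:
            return None  # not enough of this resource
    for resource, amount in outputs.items():
        result[resource] += amount * times
    return result

start = Counter({'processing chip': 10000,
                 'memory chip': 15000,
                 'minute': 25000})

make_laptop  = ({'processing chip': 1, 'memory chip': 1, 'minute': 4},
                {'laptop': 1})
make_desktop = ({'processing chip': 1, 'memory chip': 2, 'minute': 3},
                {'desktop': 1})

# One possible (not necessarily optimal!) plan: 4000 laptops, then 3000 desktops.
bundle = react(start, *make_laptop, times=4000)
bundle = react(bundle, *make_desktop, times=3000)
print(bundle)   # shows what's produced and what's left over
```

A state is reachable exactly when some sequence of reactions like this takes you there without any quantity going negative.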

However, you don't need to know anything about the reachability problem to solve these puzzles! Ordinary math and some careful thought are enough.

I won't do these puzzles for you, but here's a hint. Let \(x\) be the number of laptops you make, and let \(y\) be the number of desktops. You are trying to maximize some function of \(x\) and \(y\), namely your profit. However, there are some constraints on \(x\) and \(y\). So, draw the region in the \(xy\) plane that's allowed by these constraints, and use that to help you see which choice of \(x\) and \(y\) maximizes the profit.

If you do this, you'll have reinvented the basic idea of "linear programming". (If you already know about linear programming, or these puzzles seem very easy, please let other students post their answers first: it's great for people to invent math on their own.)

Why is this idea called "linear programming"? It's because your profit is a linear function of \(x\) and \(y\), and all your constraints are equations or inequalities involving linear functions of these variables. This makes things easier.
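Once you've worked the puzzles by hand, you can check your answer numerically. Here's a brute-force sketch of my own (not the lecture's method, and definitely not how real linear programs are solved): since the profit \(750x + 1000y\) increases with \(y\), for each number of laptops \(x\) we only need to try the largest \(y\) the constraints allow.

```python
# Brute-force check for Puzzle 59.  This only works because the numbers
# are small; real linear programs are solved with the simplex method or
# interior-point methods instead.

def best_plan(chips, memory, minutes):
    """Maximize 750x + 1000y over the feasible integer points."""
    best = (0, 0, 0)                              # (profit, laptops, desktops)
    for x in range(chips + 1):                    # x = number of laptops
        y = min(chips - x,                        # processing-chip constraint
                (memory - x) // 2,                # memory-chip constraint
                (minutes - 4 * x) // 3)           # time constraint
        if y < 0:
            continue                              # this many laptops is infeasible
        profit = 750 * x + 1000 * y
        if profit > best[0]:
            best = (profit, x, y)
    return best

profit, laptops, desktops = best_plan(10000, 15000, 25000)
```

For Puzzle 58, just drop the time constraint. Notice that your best answer lands at a corner of the region you drew: the maximum of a linear function over a polygon is always attained at a corner, and that geometric fact is the starting point of the simplex algorithm discussed below.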

Many industries use linear programming to solve problems involving enormous numbers of variables. Linear programming is good for many things, including routing, scheduling, and assignment problems.

Because of this, there's been an enormous push to improve algorithms for linear programming. Some of the basic ideas go back to Fourier: in 1827 he figured out how to eliminate variables from a system of linear inequalities. But linear programming really took off in 1939, when the Soviet mathematician and economist Leonid Kantorovich developed it as a method to reduce costs for the Soviet army and increase losses inflicted on the enemy in World War II. Around the same time, the Dutch-American economist T. C. Koopmans figured out how to formulate many economics problems as linear programming problems. Kantorovich and Koopmans later shared the Nobel prize in economics for this work.

One of the most important developments came around 1947 when George Dantzig, working for the US Air Force, was trying to find the best assignment of 70 people to 70 jobs. To do this he reinvented linear programming and invented the first efficient algorithm to solve these problems: the simplex algorithm.

The simplex algorithm runs in polynomial time on average, but in the worst case it can take exponential time. The first polynomial-time algorithm for linear programming was found by Leonid Khachiyan in 1979: it's called the ellipsoid algorithm. But this method is very slow in practice. A better polynomial-time algorithm is Karmarkar's algorithm, discovered around 1984. AT&T patented this algorithm, and that led to a big controversy about whether math can be patented!

Given all this hard work, it's unlikely that resource theories will help anyone with linear programming... at least in the short run. It's more likely that seeing linear programming as part of the subject of resource theories will give us new ideas, by making new connections apparent. That's how category theory often works.

There are also applications of resource theories to other aspects of economics, as well as to thermodynamics and quantum computation. But all these take more work to explain, so next time I'll just dive into the math of resource theories. If you want to get ready, read Section 2.2.1 of Seven Sketches. I'm eager to get started!

© 2018 John Baez