I like the paper by Andrew Kensler on tree rotations to optimize trees with simulated annealing (there's a lot of good stuff at his site if you click his name). It reminded me of this old talk by Sedgewick on Left-Leaning Red-Black Trees; I can't remember if I linked it up when I first read it. It's very elegant and a nice clean way to understand self-balancing binary tree algorithms. Not really super related to the Kensler paper, it just fired that neuron for me.
The simulated annealing thing is a fun approach. It works on any problem where you can quickly find an approximate greedy solution, where the true global optimum is very hard to search for, but where you can locally measure how a change affects the global cost. The LZ optimal parse we just talked about is exactly one of these problems. Think of it as a shortest-path search: seed the path with a greedy normal parse to get a decent initial solution, then consider local changes and see if they improve it, e.g. replacing a match with a shorter match and then filling in the gap with the best match there. If you just take every local change that improves the rate, you get closer to the true optimum, but you won't reach it because you get stuck in a local minimum. So, simulated annealing to the rescue: it lets you agitate out of the trough and temporarily screw up your LZ code stream before you settle back down.
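The acceptance rule is the whole trick: always take improving moves, and take a worsening move with probability exp(-delta/T), where the temperature T decays over time so the agitation dies down. Here's a minimal generic sketch of that loop; the cost function and neighbor move are toy stand-ins (a 1D function with sine ripples), not an actual LZ parse, and the schedule parameters are just plausible defaults:

```python
import math
import random

def anneal(initial, cost, neighbor, t_start=1.0, t_end=1e-3, cooling=0.995, seed=0):
    """Generic simulated annealing: always accept improving moves, and
    accept worsening moves with probability exp(-delta / T), where the
    temperature T decays each step so the search settles down over time."""
    rng = random.Random(seed)
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        cand = neighbor(current, rng)
        cand_cost = cost(cand)
        delta = cand_cost - current_cost
        # Improvements always pass; uphill moves pass with prob exp(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current, current_cost = cand, cand_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost

# Toy cost with many local minima: pure greedy descent from x = 4.0 can
# get stuck in a ripple of the sine term; annealing can agitate out.
def cost(x):
    return x * x + 3.0 * math.sin(5.0 * x)

def neighbor(x, rng):
    return x + rng.uniform(-0.5, 0.5)

best_x, best_c = anneal(4.0, cost, neighbor)
```

For the LZ case the shape is the same: `cost` would be the coded size of the stream, and `neighbor` would be one of those local parse edits, like swapping a match for a shorter one and re-filling the gap.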