The linked article and the title are pretty misleading. But if you go and look at the abstract for the paper, they mention that the extra energy comes from pumping heat out of the environment.
Looking at the figure attached to the article, you can see that the only >100% efficient trial was at a 135 °C ambient temperature (the two other trials, at 84 °C and 25 °C, were much less efficient). The extra energy comes from the environment.
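To put some numbers behind that, here's a quick back-of-envelope in Python. The 69 pW of optical output is the figure quoted downthread; the electrical input power is a placeholder I'm assuming purely for illustration, since it isn't quoted anywhere in this thread.

    # Back-of-envelope energy balance for an LED running above unity
    # wall-plug efficiency. Output power is the ~69 pW quoted in this
    # thread; the electrical input is an assumed placeholder.
    P_light_out = 69e-12       # W, optical output reported at 135 C ambient
    P_electrical_in = 30e-12   # W, assumed electrical input (placeholder)

    efficiency = P_light_out / P_electrical_in
    Q_from_environment = P_light_out - P_electrical_in  # heat pumped in from ambient

    print(f"Wall-plug efficiency: {efficiency:.0%}")
    print(f"Heat drawn from the environment: {Q_from_environment*1e12:.0f} pW")

The point is just that the ">100%" bookkeeping only counts electrical input; the balance is made up by heat absorbed from the (hot) surroundings.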
I'm not sure if this would work the same way, but I know that some high-voltage LEDs are actually just strings of multiple LEDs in series [1]. Though it clearly wouldn't reach the inflated efficiency in this article, this could lead to a chip with a large number of very small LEDs reaching higher efficiency than a standard LED. It obviously depends on the manufacturing process, though.

[1] http://www.cree.com/products/pdf/led%20arrays.pdf
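For what it's worth, the "high-voltage LED" point is just forward voltages adding in series while the same current flows through every die. A toy sketch (die count, per-die forward voltage, and drive current are made-up numbers, not taken from the Cree datasheet):

    # Toy illustration: a "high-voltage" LED package as N dies in series,
    # so forward voltages add while one drive current flows through all of them.
    # The numbers below are made up for illustration, not from the Cree PDF.
    n_dies = 12
    v_forward_per_die = 3.1   # V, typical-ish for a white die (assumed)
    i_string = 0.020          # A, drive current through the series string

    v_string = n_dies * v_forward_per_die   # total forward voltage
    p_string = v_string * i_string          # electrical power into the string

    print(f"String forward voltage: {v_string:.1f} V")
    print(f"Electrical input power: {p_string:.2f} W")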
Wow: 69 whole picowatts of light, and only if the ambient temperature is around 275 °F (135 °C). And the fundamental physics means that neither of those figures is likely to change.

If the claim were more spectacular, we'd call this snake oil.
So could you just have a microarray of millions of these to get a practical amount of light at an insanely low electricity cost? (And given jnhnum1's clarification, also be cooling the room slightly?)
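Rough scaling on that question, taking the 69 pW per device at face value and (generously) assuming the >100% efficiency survives aggregation; the 1 W optical target is just a stand-in for "a practical amount of light":

    # How many 69 pW emitters would it take to make a useful light source?
    # The 1 W optical-output target is an arbitrary stand-in for "practical".
    p_per_device = 69e-12   # W of light per device, from the thread
    p_target = 1.0          # W of optical output we'd like (assumed target)

    n_devices = p_target / p_per_device
    print(f"Devices needed: {n_devices:.2e}")   # ~1.4e10, i.e. tens of billions

So millions wouldn't get you anywhere close; you'd need tens of billions of these, and they'd all have to sit in a ~135 °C ambient to hit that efficiency in the first place.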