Energy Saving Forum

New ideas for efficient power conversion, motor control, and lighting applications

CONVENED AND MODERATED BY JIM HARRISON

In December, Electronic Products convened its second forum on energy-saving designs. As we say in our Energy Saving Initiative monthly series of articles, we believe the design-engineering community should take a leadership role in environmental responsibility by developing energy-efficient products. To that end, we asked six industry leaders for their insights into the latest design methods and key opportunities in this area. The complete forum is about twice as long as published here and can be found on our web site, www.electronicproducts.com.

Eric Persson (Director of Technical Education, International Rectifier): After reading your initial call to put this forum together, it occurred to me that we in the engineering and device community have a lot of different solutions. But ultimately, whether they're implemented or not gets down to market acceptance and energy costs.

Electronic Products: Right, so we may need a bit of a reality check.

Ron Vinsant (Senior Field Applications Engineer, Fairchild Semiconductor): I think that most of the designs are driven by economics. It’s not just a technology play of some sort.

Electronic Products: Yes, it’s also driven now by corporate responsibility.

Ron Vinsant: Going green?

Electronic Products: The way that we want to be perceived. I guess we could call it a guilt complex.

Bruno Baylac (Director of World-wide Marketing, Industrial/Consumer, Freescale Semiconductor): As you know, in Europe it's a little bit more than marketing. The legislation is changing; there are new laws that address environmental concerns and global warming, and of course energy efficiency is a very hot topic right now.

And in Germany, France, Italy, and the UK, there are new rules coming into force in the next few years that will have a significant impact on electrical and electronic equipment.

Andrew Smith (Product Marketing Manager, Power Integrations): Yes, I think that’s important, and I think you can take that a little further and look at the proposals for Energy Star 2.0 coming in July. And that’s going to play into making some improvements in efficiency mandatory. So this call for ethics isn’t going to be the only driver.

Ron Vinsant: I think that as we go through this we can make some points where it is not just a guilt complex, it’s terawatts of power that can potentially be saved, on a worldwide basis. And besides green guilt, there’s the amount of money that organizations have to spend for energy and those costs continue to rise. It is certainly going to be even more compelling in 10 years.

Stephen Oliver (Vice President, Marketing & Sales, Vicor/V•I Chip): We touched on the legislation. One point is that it's fragmented across the world. Obviously, you've got things like Energy Star, which is not compulsory but is becoming consumer driven. If you don't have that sticker, some customers will go elsewhere.

There are compulsory things in place in Japan, Europe, and China, like PFC or other THD regulations, which don't apply in the U.S. And there's the big question of why the U.S. doesn't have a PFC standard.

So as well as the legislation itself, there’s an element of fragmentation.

Eric Persson: I'd like to just comment on the PFC. I was just reading through an EPRI report from a workshop on energy efficiency they held back in February of '07.

And one of the points on PFC is that it's great in terms of allowing the utility to transmit and distribute electricity to end users more effectively, because there's less reactive power and more real power. But it doesn't actually improve energy efficiency; there's actually a slight energy penalty. So it's good for utilities, but . . .

Electronic Products: Not good for the end users.

Eric Persson: Yes, exactly.

Stephen Oliver: I disagree with that, Eric.

Ron Vinsant: I disagree with that, too.

Stephen Oliver: If you have PFC, you therefore create a very nicely controlled, say, 380-V bus, which means the downstream components, whatever topology and whatever kind of semiconductors you use, don't have to be oversized. Whereas if you don't have a PFC stage at the front end, then you end up oversizing the FET voltages, therefore the RDS(on) goes up, therefore the losses go up.

So I would contend that given a nice stable PFC voltage, the downstream equipment is much more efficient, and that outweighs any potential loss in the PFC stage.
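
As a rough illustration of Stephen's argument (the numbers below are assumptions chosen for the example, not figures from the forum), conduction loss scales with the square of the switch RMS current, so a bulk rail that sags at low line, combined with a higher-voltage, higher-RDS(on) switch, compounds quickly:

```python
# Illustrative sketch: why a regulated PFC bus can help downstream efficiency.
# All component values and duty cycles are assumptions, not vendor data.
def conduction_loss(p_out, v_bus, duty, rds_on):
    """Crude conduction loss of a forward-type stage's primary switch:
    average input current ~ P_out / (V_bus * duty), RMS ~ average / sqrt(duty)."""
    i_avg = p_out / (v_bus * duty)
    i_rms = i_avg / duty ** 0.5
    return i_rms ** 2 * rds_on

P_OUT = 300.0  # watts delivered downstream

# With PFC: a tightly regulated 380-V bus, so the switch can be a ~500-V,
# lower-Rds(on) part.
with_pfc = conduction_loss(P_OUT, v_bus=380.0, duty=0.45, rds_on=0.19)

# Without PFC: the bulk voltage sags toward ~120 V at low line, yet the switch
# must still survive high-line peaks, so it is a higher-voltage, higher-Rds(on) part.
without_pfc = conduction_loss(P_OUT, v_bus=120.0, duty=0.45, rds_on=0.38)

print(f"conduction loss with PFC:    {with_pfc:.1f} W")
print(f"conduction loss without PFC: {without_pfc:.1f} W")
```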

Electronic Products: Right, but it probably is a draw if anything. I know one office building in the Far East where they put in power factor correction at the front end, and that allowed them to reduce the size of the copper wires throughout the building, which is a savings too, but not an energy saving.

Ron Vinsant: I just want to point out that for a wide operating range, say 85 to 264 V, that power factor correction allows the broadband efficiency to be higher than a non-power-factor-corrected design as a rule.

Electronic Products: Broadband?

Ron Vinsant: Yes, what I mean by that is that if you look across the 85 to 264-V range, you're going to be better off than with a similar design without PFC.

Andrew Smith: You'd think there would be a penalty for PFC when there is any no-load operation, or for the standby application. Possibly some of the larger PFC power supplies don't see a no-load application, but certainly for control functions and those kinds of things where the power is slightly lower, you're paying more for standby. I know some of the areas we're going to look at, things like elevators, spend a lot of their life not actually delivering any operating power, but are in standby mode. So that's something we should probably consider.

Electronic Products: So, Eric, the power factor corrector in that case, let’s use an elevator, during idle it’s going to use quite a bit of energy, or not?

Ron Vinsant: It typically uses more, it depends on the architecture of the power factor corrector also.

Electronic Products: Okay, but in some cases, in standby the power factor corrector isn’t going to use much energy?

Stephen Oliver: In some cases, it’s completely bypassed, like in a lot of big TVs, you probably have a switch or something in there that is a bypass in standby, so it actually shunts it around the main power train.

Eric Persson: On the elevator side of it, they usually use a dual inverter so you can have power factor correction at any load and either direction of power so they can essentially regenerate back into the line.

Electronic Products: Oh, I didn’t know they did that with elevators, so going down I’m producing energy. I’m a hero!

Stephen Oliver: Well you’re not saving energy because you use some of it on the elevator to make it go up.

Electronic Products: Oh no, I took the stairs.

Bruno Baylac: I see that PFC is an important aspect, but I think we should take a broader look. We all know that there is a significant opportunity to save electricity, that motor control is a significant part of electricity consumption, and that the better we control the motor, and the more efficient the motor, the more tremendous the savings we can find.

So there are plenty of opportunities, such as by adding intelligence to motors, and in better lighting systems, and even in electricity distribution, where we can save a lot of electricity. And this is very significant.

Electronic Products: Yes, but we need to go back; I want to make sure we're done with power factor correction.

It looks like we’re saying that it has certainly a benefit to the power company and to the power lines and overall electric benefit, but not necessarily much in the way of true energy savings.

Ron Vinsant: Some, and let's take that one step more. Remember, power factor is really about harmonic elimination.

Eric Persson: Yeah, agreed.

Ron Vinsant: And then the thing that you get from harmonic elimination is you minimize the current in the neutral line of the distribution system in a building.

And what’s important there is that when you’ve produced the third harmonic there’s a lot of heating in the transformer. Every time we produce this additional heat the HVAC system has to remove that heat from the building.

Sometimes the heat is useful, but most of the time it’s not. So for every watt that you put into the building, you typically also use a watt to get that extra heat out. So the savings that you get by taking that third harmonic out of distribution transformers can be significant.

Eric Persson: Yes, I agree with Ron. That’s a good point, and especially in lighting systems where you have a big box store with thousands of fluorescent lighting ballasts that are all interconnected on a three-phase system.

Their third harmonics can create a lot of heat. They can actually add up to greater than the fundamental current, so there is a tremendous amount of heating with no delivered-energy benefit.
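
A quick way to see Ron's and Eric's point about the neutral conductor (the phase currents and harmonic amplitudes below are assumptions for illustration): in a balanced three-phase system the fundamentals cancel in the neutral, but the triplen harmonics drawn by non-PFC loads are in phase on all three legs and simply add.

```python
# Illustrative sketch (assumed currents): fundamental components cancel in the
# neutral, while the 3rd-harmonic components from non-PFC loads add up.
from math import pi, sin, sqrt

I1, I3 = 10.0, 6.0            # assumed fundamental and 3rd-harmonic amplitudes, A
N = 2000                      # samples over one fundamental cycle

def phase_current(theta, shift):
    # fundamental shifted by 120 deg per phase; the 3rd harmonic shift is
    # 3 x 120 = 360 deg, i.e. identical on all three phases
    return I1 * sin(theta - shift) + I3 * sin(3 * (theta - shift))

def rms(samples):
    return sqrt(sum(s * s for s in samples) / len(samples))

thetas = [2 * pi * k / N for k in range(N)]
phase = [phase_current(t, 0.0) for t in thetas]
neutral = [sum(phase_current(t, p * 2 * pi / 3) for p in range(3)) for t in thetas]

print(f"phase RMS:   {rms(phase):.2f} A")     # sqrt((I1^2 + I3^2)/2), ~8.2 A here
print(f"neutral RMS: {rms(neutral):.2f} A")   # only the triplens survive: 3*I3/sqrt(2), ~12.7 A
```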

Electronic Products: All right. I wanted to cover the industrial area, which of course includes large motors, lighting, home appliances, and especially computers and servers, and then there are a couple of secondary issues that maybe we'll have time to take on. Can we talk about larger motors, industrial motors, most of which are probably running 24/7, right?

Bruno Baylac: Motors are definitely a major consumer of electricity, both in industrial applications and in general. The number is somewhere between 60% and 70% of electricity consumption going into motion and motors.

In terms of industrial applications, you mentioned big motors, but everything related to pumps, airflow, and conveyors. As you mentioned, these things are working 24/7. What we are hearing from our customers is that maybe 5% of industrial motors are using sophisticated electronic control.

And we know that we can significantly improve the efficiency of these motors by using more sophisticated, and more costly, electronic control and variable speed; if you do the math, this presents a significant savings opportunity, and the incremental cost you put in at the beginning will be repaid rapidly. I think there is a process of education here, to demonstrate to the people in charge of facilities in the big factories that by upgrading their systems they can generate significant savings.
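
To make Bruno's "do the math" concrete, here is a minimal payback sketch; every figure (motor size, savings fraction, tariff, drive cost) is an assumption chosen for illustration, not forum data.

```python
# Illustrative payback arithmetic for retrofitting a variable-speed drive.
MOTOR_KW = 30.0          # pump/fan motor running 24/7 (assumption)
HOURS = 24 * 365
ENERGY_SAVING = 0.25     # assumed 25% saving from matching speed to demand
TARIFF = 0.10            # assumed electricity price, $ per kWh
DRIVE_COST = 6000.0      # assumed installed cost of the drive, $

annual_saving = MOTOR_KW * HOURS * ENERGY_SAVING * TARIFF
payback_years = DRIVE_COST / annual_saving
print(f"annual saving: ${annual_saving:,.0f}, payback: {payback_years:.1f} years")
```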

Eric Persson: That’s an area that we certainly look at as well. How to develop a system where our customers can switch to variable speed as opposed to mechanical means of changing speed. The traditional way was using belts and pulleys, and then later came the old SCR kind of drives.

But with modern inverters you can independently control both the torque-producing field and the flux-producing field. And those are becoming more cost effective at this point, so that not only at the industrial level but even down at the consumer level you're able to take advantage of that.

I think there are three areas that you can look at in terms of drives. One is the motor itself; certainly permanent magnet motors are intrinsically more efficient than ac induction machines. I'm talking about brushless motors, and permanent magnet ac is a similar structure to brushless dc, but sinusoidally commutated instead of trapezoidally commutated.

On the inverter side, it's making more efficient and more robust inverters that are able to operate over a wide speed range. Particularly when you go to sinusoidally commutated motors, you can extract one of the greatest advantages of control: the ability to do field weakening, which allows you to, on the fly, change the effective motor constant, KE, so you can operate at low speed, get good torque, and then drop the motor constant and be able to run the motor up to three or four times its base speed.
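
A minimal sketch of the field-weakening idea Eric describes, with an assumed bus voltage and motor constant (and ignoring resistive and inductive drops): lowering the effective KE raises the speed at which the back-EMF hits the bus-voltage limit, at the cost of torque per amp.

```python
# Illustrative field-weakening sketch (all values are assumptions):
# back-EMF ~ KE * speed, so max speed ~ V_bus / KE.
from math import pi

V_BUS = 300.0       # available dc bus voltage, V (assumption)
KE_NOMINAL = 0.5    # motor constant, V per rad/s (assumption)

def max_speed_rpm(ke, v_bus=V_BUS):
    return (v_bus / ke) * 60 / (2 * pi)

for flux in (1.0, 0.5, 0.33):        # fraction of nominal flux via field weakening
    rpm = max_speed_rpm(KE_NOMINAL * flux)
    print(f"flux at {flux:4.0%} of nominal -> max speed ~ {rpm:6.0f} rpm")
```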

Jeff Bier (President, Berkeley Design Technology): That opens up or suggests another area which is worth mentioning. Once you get digital feedback control into these systems, whether it’s the motor or power distribution or whatnot, it provides possibilities to do all kinds of things.

Things, for example, to improve reliability, to detect failures before they become catastrophic, and so on. And a lot of this is based on digital signal processing theory and algorithms that have been developed over the last 50 years which are well established but have not been economical to deploy in large numbers of systems like motors in the past, but are now becoming much more economical as the cost of the hardware is coming down.

Electronic Products: That’s a good point, so that makes better, more efficient, motor drive systems much more attractive.

Ron Vinsant: The other thing I'd like to point out here is that one of the issues that has prevented the increased use of sophisticated controls in almost all industries is whose bucket of money pays for what.

When you’re initially deploying an industrial system like an industrial robot line, the people who are putting out the money for the robot line aren’t the facilities people. The planners for the production line generally are concerned with the capital cost of the equipment at the time the factory is being deployed and don’t necessarily look in any great detail at the long-term operating costs.

Stephen Oliver: Yes, it seems like the industrial market is three years behind the server market in terms of the facilities guy talking to the guy who actually buys the equipment.

Andrew Smith: It is possible for facilities to influence the buying strategy, because when you look at some of the other segments the facilities guys have enough input to change the strategy.

And you'll see LEDs for architectural lighting getting to be widely used now because there are very low maintenance costs associated with them. So I think it's absolutely a great comment that the cost of implementing a system is borne by a different group, but there are processes for getting around that now.

Ron Vinsant: That is part of the change in business culture, and that is probably more difficult to overcome than the technical issues.

Eric Persson: Yes, and it is not only in the industrial area but also in the consumer area. One really great success story, I think, in terms of variable-speed motor drives has been the efficiency of refrigerators. In a report from the California Energy Commission, they plot the energy used by refrigerators by year, starting in 1972, normalized to a hundred.

In 2003 it dropped down to about 25, so four times more efficient, which is a tremendous energy savings for the consumer. And yet, there was a Gallup poll last year that asked consumers: okay, if you're going to buy, let's say, a $500 refrigerator, and you could pay $600 for an energy-saving refrigerator and recover that extra $100 over one year, would you do it? And the majority answer was no. Consumers think only in terms of today's dollars, today's payment, it seems, rather than the long-term energy savings, just like the industrial customers we're talking about.

Jeff Bier: Furthermore, they’ll put internet terminals on the refrigerator, which will only increase the energy used.

Bruno Baylac: That’s true. Between the capital investment and the operating costs, there are a lot of education issues, and there are different people running different Excel spreadsheets, I should say. The industrial area is moving, a bit slowly, but moving. These Energy Star inputs and information communicated to the consumer have a tremendous impact on the efficiency improvement.

Electronic Products: Which drive method is the very best?

Eric Persson: Well, it's sinusoidal commutation, and the motor type is a permanent magnet ac motor. There are two variations of that: there's external or surface permanent magnet and there's internal permanent magnet, and a further variation is how the windings are arranged. There are traditional layered windings or concentrated windings, and each of them has some benefits to the designer.

One of those benefits, and one of the things that many drive and motor designers are trying to do jointly, is to design a motor that acts as its own sensor. In other words, you can get rid of shaft encoders and Hall-effect sensors and all of that.

Back-EMF sensing is used to get rid of shaft encoders; some of those methods are protected by GE patents that I think are about to expire, but nonetheless, they allow you to operate a motor by more advanced means, which really simplifies the overall system.

Electronic Products: Okay, and this doesn’t compromise performance or reliability?

Eric Persson: Exactly.

Bruno Baylac: In my experience, there is no perfect solution. This is highly dependent on the application, and as I've mentioned before, there are very different types of motors, which may be better for general torque, or for low-speed or high-speed operation.

This is really related to the application you are targeting, but there is one common denominator: the trend toward vector control, closed-loop control, and inverter control, in order to have these capabilities and variable speed, and to match the characteristics of the motor to the application by adding flexibility through embedded intelligence.

Eric Persson: I completely agree with what Bruno just said, and to clarify on that, switching losses are never good, so if you can reduce your PWM frequency, or even go down to one PWM cycle per commutation cycle by varying the dc bus voltage, that can be very efficient.

However, the tradeoff is now you don’t necessarily have fast dynamic response. So it’s true: it really depends on the application.

Electronic Products: Okay, so there’s no straightforward and easy answer, but still sinusoidal commutation is usually most efficient.

Eric Persson: I think you could argue that either brushless dc or sinusoidal commutation is the most efficient, because in brushless dc you have only two switches operating at any time, one of them steering and one of them doing PWM, but at higher current, compared to a sinusoidal drive where you've got all six switches in PWM all the time.

It really depends. The advantage you get in the sinusoidal is that you can control the flux level more easily, so you can have a motor that at light load you’re not driving it with as much current. So you really have to look at the application and determine which method is best applied and is your goal efficiency or is your goal cost, or is it dynamic performance or extended speed range. There are a lot of variables.

Electronic Products: Jeff, do you see a DSP in the actual feedback loop? Is that very practical or useful?

Jeff Bier: Yes, I think it's beginning to be done more and more, and I think it's going to become prevalent. The same kind of digitization has swept through, let's say, consumer electronics on the video and audio signal processing side, where 20 years ago it was all analog and pretty soon it's going to be all digital. The same thing is likely to happen in a lot of motor control and motion control applications, because then you have the same benefits accrue.

Bruno Baylac: You have small digital signal controllers or digital signal processors, in production today in dishwashers, in washing machines, and in refrigerators.

Electronic Products: So you’re saying DSC is already in there?

Bruno Baylac: It’s already in there.

Eric Persson: As a matter of fact, it's gone one step further. The difficult algorithms that do vector rotation and two-to-three-phase conversion, and so on, are now implemented in hardware rather than in the general-purpose processor, and you simply have a hardware engine doing the motion control.

Jeff Bier: The thing that is particularly interesting about this is that once you get digital CMOS functionality in the control loop, then Moore's Law becomes your friend, which isn't necessarily the case with power semiconductors.

Microprocessor and DSP guys are able to deliver more performance for less cost every year. And that means you can run more complex algorithms and increase your efficiency, functionality, and reliability. I think we’re just beginning to see the potential here that is going to play out over the next decade.

Eric Persson: The trend on the research side that I believe will be heading into the industrial drive area is that drives, instead of being functionally separate from the system, will be increasingly measuring their temperature, looking at overload characteristics and so on, and instead of just shutting down from over current, they’ll more elegantly power limit and communicate with their controller that it’s getting too hot, time to back off a bit, thus making the system more robust and more reliable.

Ron Vinsant: In the machine tool world, companies like Bently Nevada have been making motor controllers for CNC machines that monitor the current draw for constant feed on cutting tools and predict when the cutting tool is going to need to be replaced to maintain metal-removal rates.

So this has been going on for a long time, and in the robot world, the amount of time of operation, the temperature of operation, and the torque that the motor has to put out for a given function are used to determine wear-out mechanisms in bearings.

Electronic Products: So that will help with energy draw and reliability issues. Let’s talk a little bit about lighting applications and who would like to jump into that?

Bruno Baylac: We all know that lighting is also a major contributor in terms of electricity consumption and that the incandescent lamp will probably die in the coming, let’s say, 5 to 10 years and has already been banned in some countries.

The big lighting companies have agreed that they're going to stop production of incandescent lamps sometime around 2010 or 2012, and that new technology, like compact fluorescent lamps, will replace them. The next step is the LED, or hybrid LED, which is offering tremendous potential, starting with architectural illumination. There is tremendous support behind it; I think this technology is just starting, and it could even displace all other lighting solutions in the coming 10 to 20 years.

Stephen Oliver: I completely agree with Bruno. I think LEDs are currently running at 100 lumens per watt, they're going to get maybe double that in a few years' time, and the efficiency is incredible.

Ron Vinsant: At the last APEC, one of the gentlemen from one of the LED companies was talking about 212 lumens/W as being an achievable goal in the near future, and I would also like to point out that our governor here in California recently signed a bill banning the sale of incandescents, I think from January 1, 2009, on.
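
For a sense of scale on those lumens-per-watt figures, here is a simple comparison of the power needed to produce the light of a typical 60-W incandescent bulb; only the 100 and 200 lm/W LED numbers echo the forum, the other efficacies are rough assumptions.

```python
# Illustrative efficacy comparison (assumed figures for incandescent and CFL).
TARGET_LUMENS = 800.0       # roughly a 60-W incandescent's output

efficacy_lm_per_w = {
    "incandescent (~15 lm/W, assumed)": 15.0,
    "compact fluorescent (~60 lm/W, assumed)": 60.0,
    "LED today (~100 lm/W)": 100.0,
    "LED projected (~200 lm/W)": 200.0,
}

for tech, lm_w in efficacy_lm_per_w.items():
    print(f"{tech:42s}: {TARGET_LUMENS / lm_w:5.1f} W")
```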

Electronic Products: So, going back to compact fluorescents, is there something engineers could do to improve the drive of these, or have they gotten about as good as they're going to get?

Eric Persson: There are a couple of technologies to look at, there’s high-intensity discharge and the fluorescent lamps. I think they both still have some room to grow.

One of the crippling factors for them is that they all rely fundamentally on low-pressure mercury vapor, and even though they've found ways to reduce the mercury content, it's still a major environmental headache to dispose of.

But on the other hand, one of the advantages of fluorescent and HID is that they’re reasonably high in lumens per watt, I believe they’re in the 50-lumens/W range right now, commercially available, and they can have much higher color-rendering index. In other words, a broader spectrum that gives a more natural light particularly for your work environments or for retail environments.

Electronic Products: More natural looking than LEDs, you’re talking about?

Eric Persson: Yes, or even other phosphors that are used in fluorescent lighting where you can get high lumens per watt, but it’s a very narrow spectrum. If you’ve ever gone out into a parking ramp that has sodium lighting and try to find your car, they all look gray.

Stephen Oliver: I think in Raleigh, North Carolina, they have gone to LED lighting for car parks, and things like that. And, obviously, that is a white light.

Andrew Smith: You also see a lot of LED lighting being used in things like traffic lights and those kinds of areas where the cost benefit is about maintenance issues. I think LED color gamuts are an issue; there are some folks who can improve the color matching from LEDs, and if you look at something like TV backlighting, that's just starting to move to LED-based systems, which is giving very, very good color reproduction.

I do think the comment that LEDs are 15 to 20 years away from general lighting is probably a very, very good observation. There’s still a lot of work to do with LEDs.

Ron Vinsant: There is a hybrid technology also out there where they use ultraviolet-emitting LEDs, in other words, blue LEDs, that are in the ends of fluorescent tubes; the tube has a phosphor coating in it, and the UV light excites the phosphor, which emits light. That's an experimental technology, but one of the arguments for using it is that it is more efficient than fluorescent, emits a light very similar to fluorescent, and it fits in all of the existing lighting fixtures that are already there.

Electronic Products: You still have the mercury problem with that, right, Ron?

Ron Vinsant: Apparently so, apparently there’s still some mercury to it.

Eric Persson: I wonder about that, because the way a fluorescent lamp works is that it gets the mercury vapor to discharge, which generates UV, and the UV excites the phosphor; that's the traditional fluorescent lamp.

So if you’re using only UV from LEDs then maybe you can get away from the mercury, that’s very interesting.

Ron Vinsant: I thought I did read where there was still some mercury used for some reason, but I don’t see why, I agree with Eric.

Electronic Products: That’s very interesting technology. Anything else on the lighting front?

Ron Vinsant: I think there's one other thing with lighting that's been promised for a long time and hasn't happened, which is being able to tell your lights to turn on from your cell phone, for example.

And it would take very, very little incremental cost to add an interface that gives you that control. So that may be one of those things that, if it were properly used and marketed, might compel people to want to change how they do their lighting. So again, a humanistic issue; we'll have to see where that goes.

Electronic Products: Okay. Home appliances we've talked about some; are the changes about done with appliances? Has everything been done with them, or is there something more going on?

Ron Vinsant: One other thing that might happen with appliances is if you have a DSP in a dishwasher, you can look at the quality of the water in the dishwasher and only use exactly the amount of water that you need to clean the dishes and maintain the proper cleanliness.

Eric Persson: Yes that is right on, it’s the same thing with washing machines, and if you look at energy use of washing machines, it isn’t the electric energy that’s the big cost, it’s the hot water.

So being able to use the appropriate amount of water, and only change it when required, is something that can really help improve energy efficiency.

Jeff Bier: I would agree with Bruno that digital-signal-processing-based control has been deployed in high-end appliances. I think what’s changing now is as the chips are getting less and less expensive and more and more capable, we’re increasingly going to see them in lower-cost appliances, which are the ones that get produced in higher numbers.

Electronic Products: Okay. Mostly used for the motor control though, right, Jeff?

Jeff Bier: Right.

Bruno Baylac: There is more and more embedded intelligence in appliances, and we're now measuring water level down to the one-millimeter level in order to optimize the amount of water you are using.

Electronic Products: So in those kind of appliances, are there any new ICs, new technology that’s just coming on line that engineers should be aware of?

Eric Persson: One of the areas that IR’s working on to simplify motor drives in appliance applications is to partition the system into a smart-power module and a control box.

And for systems where you have multiple motors, let’s say HVAC where you have a compressor motor and a blower motor, combine that functionality, use a single IC to control both motors, with each one having a separate power stage and current limiting.

That's the direction we see: you'll have a functional power block and a control block, and then it's all run by the user interface through a serial bus.

Ron Vinsant: I agree with Eric. Fairchild has a similar SCM module product, and the idea is that you integrate power semiconductor technology in combination with front-end analog technology, sensors, and signal conditioning.

And then you put the combination into an ASIC, and you buy a DSP core and combine that with an ARM control processor.

Bruno Baylac: There is also the wirelessly connected appliance, connected to the electricity system in order to better manage the load and the overall electricity consumption.

Ron Vinsant: The California Energy Commission, back in the '70s during the first energy crisis, developed a system whereby very large users of air conditioning could be controlled by the grid managers.

And this was done through a proprietary system of radio frequencies, and that was actually pretty successful and it’s still in use. The goal that Bruno’s talking about within California is to extend that so that you could put your clothes in your clothes washer and tell it to turn on, and it might not turn on then, but it would later, at a better time for the grid. And the grid would have computers that would manage this system, through a wireless network.

Eric Persson: I live in Minnesota, where we're not running our air conditioners right now, but we do have the switches. As a matter of fact, the way it works here is that our utility company offers us an annual rebate if we allow them to install a remote-control switch on our HVAC compressor, on the central air.

And they will duty-cycle modulate it during peak demand. It will reduce your peak cooling a bit, but it's a significant dollar savings, and quite a few people participate in that program.

Electronic Products: Okay. Let's move on then to computing: servers, and basic computing.

I have a note here from NEC. They were talking about desktop PCs that are going to become 80% more efficient, along with projectors used for presentations, laptop PCs, and fax machines. One of the big issues there that I know the Energy Department has been looking at a lot is standby power. Anybody want to comment on that?

Bruno Baylac: You're right, standby power is a real issue. Some data has shown that in a typical residential configuration, standby power can reach 10% of the overall energy consumption. There is a regulation already in place in Europe called the Code of Conduct, which is mainly targeting set-top boxes and that kind of equipment, where you have to draw less than 1 W in standby.
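
As a rough check on that 10% figure, here is a small arithmetic sketch; the device list, counts, and standby draws are assumptions for illustration, not data from Bruno.

```python
# Illustrative standby-energy arithmetic (all device figures assumed).
HOURS_PER_YEAR = 24 * 365

# (device, assumed standby draw in watts, assumed count per household)
standby_loads = [
    ("set-top box",        8.0,  1),
    ("TV",                 3.0,  2),
    ("DSL modem/router",   6.0,  1),
    ("chargers and misc.", 0.5, 10),
]

kwh = sum(watts * count for _, watts, count in standby_loads) * HOURS_PER_YEAR / 1000.0
print(f"standby energy: ~{kwh:.0f} kWh/year")
# Against a household using a few thousand kWh/year, that is on the order
# of the 10% Bruno mentions.
```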

Andrew Smith: That's absolutely right. In laptops and PCs, when they're running at a much, much reduced performance need, they still draw lots of power, and the challenge for the designer is to try to flatten out the energy efficiency of the system so that it's as efficient at 25% loading as it is at 100%. And then they have to look at the standby power requirements.

Electronic Products: So is there a topology for switching power supplies that is favored for this situation?

Andrew Smith: The way we've done it is to use a multimode approach, where the controller switches between different switching modes based on the load condition, and that gives very good performance across the whole power range.

Ron Vinsant: We find that there are different topologies for different power levels, but I think perhaps we cover a broader power level than Power Integrations.

So, we agree that multimode is certainly the way to go, and that might even include, for example, the power factor correction we talked about before, but we have a newer topology called LLC that uses two inductors and a capacitor. It's a resonant topology that, for example, at full load and 15 A in a point-of-load system, has an efficiency of 92% to 94% without synchronous rectification. And it has a very flat efficiency across the load range.
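
For readers unfamiliar with the LLC topology Ron mentions, here is a minimal sketch of where its two characteristic resonances fall, using assumed tank values rather than any Fairchild figures.

```python
# Illustrative LLC resonance sketch (all tank values are assumptions).
from math import pi, sqrt

L_R = 60e-6     # series resonant inductor, H (assumption)
L_M = 300e-6    # transformer magnetizing inductance, H (assumption)
C_R = 24e-9     # resonant capacitor, F (assumption)

f_series = 1 / (2 * pi * sqrt(L_R * C_R))             # switching sits near this point
f_lower  = 1 / (2 * pi * sqrt((L_R + L_M) * C_R))     # lower boundary set by Lm

print(f"series resonance: {f_series / 1e3:.0f} kHz")
print(f"lower resonance:  {f_lower / 1e3:.0f} kHz")
# Operating near the series resonance gives soft switching, which is one
# reason the efficiency curve can stay flat across the load range.
```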

Electronic Products: Anyone else using LLC or is that yours exclusively?

Ron Vinsant: No, no, there are a couple of other people using it. It's being used a lot in LCD and plasma TVs. They tend to be large power users, especially plasma TVs.

The other thing I'd like to comment on is servers, especially on the large end, as in server farms. The larger server farms have an initiative now to go from ac distribution to dc distribution. There are some issues with doing that, but the main reason for doing it is that the savings are just huge.

Electronic Products: What sort of savings?

Ron Vinsant: In a current system, the overall efficiency of an ac system is about 61%, while a dc distribution system is around 85%. That's because of the number of steps the power is processed through; there are far more steps in the ac system.
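
As an illustration of Ron's point, chaining conversion stages multiplies their losses; the stage efficiencies below are assumptions chosen only to show how numbers in the 61% and 85% range can arise, not figures from the study.

```python
# Illustrative chain-efficiency arithmetic (all stage efficiencies assumed).
def chain(*etas):
    total = 1.0
    for eta in etas:
        total *= eta
    return total

# Hypothetical ac data-center chain: double-conversion UPS, PDU transformer,
# server PSU (ac-dc with PFC), on-board dc-dc regulators.
ac_chain = chain(0.88, 0.97, 0.78, 0.92)

# Hypothetical facility-level dc chain: one rectifier/PFC stage, then dc-dc stages.
dc_chain = chain(0.94, 0.96, 0.92)

print(f"ac distribution chain: {ac_chain:.0%}")
print(f"dc distribution chain: {dc_chain:.0%}")
```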

Eric Persson: One of the EPRI studies, actually it was by Annabelle Pratt from Intel, was about data center efficiency and how it relates to servers, and it showed that the overall efficiency is about 48%, which is quite low.

Ron Vinsant: That's after you take into account the HVAC, which we hadn't gotten to yet. And incidentally, that wasn't just Annabelle. She started it, but there are 21 organizations that helped sponsor this, and one of them is Fairchild.

She did a wonderful job of modeling in a spreadsheet how this all works, and the real issue is the advent of super server farms. There are server farms being built with 500,000 deployed servers in one location. Whole hydroelectric dams are going to be needed to run one server farm. So when you talk about saving, just for example, 28% in power consumption, that's huge, but when you add onto that the HVAC load and you talk about an additional 50% savings, it just dwarfs the capital cost issue.
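
A scale sketch of why those percentages matter: the server count is the figure Ron cites, but the per-server power and cooling overhead below are assumptions for illustration.

```python
# Illustrative server-farm scale arithmetic (per-server power and overhead assumed).
SERVERS = 500_000
W_PER_SERVER = 300.0        # assumed average draw per server, W
OVERHEAD_FACTOR = 1.5       # assume ~0.5 W of cooling/overhead per IT watt

it_load_mw = SERVERS * W_PER_SERVER / 1e6
facility_mw = it_load_mw * OVERHEAD_FACTOR
saving_mw = facility_mw * 0.28          # a 28% saving on the facility total

print(f"IT load:               {it_load_mw:6.0f} MW")
print(f"with cooling/overhead: {facility_mw:6.0f} MW")
print(f"a 28% saving is about: {saving_mw:6.0f} MW")
```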

Eric Persson: The issue with the global power distribution, of course, is that you have I²R losses in the distribution system, and when you're talking about processor voltages that are subvolt, you can work on getting 0.1% more efficiency in the transmission and distribution system, but really the majority of the losses occur in the last few inches of the whole power distribution chain.

By being able to go with a higher-voltage dc bus voltage all the way up as close as possible to the load, that can create significant savings.

Electronic Products: But you’re still using a dc bus rather than an ac like Ron was talking about.

Eric Persson: Dc bus, yes.

Electronic Products: So again, the savings there is not having to do the ac conversion all the time everywhere, you just do it one time up front, and do the power factor correction, and then distribute dc.

Eric Persson: Yes.

Stephen Oliver: Picking up on the Lawrence Berkeley and Annabelle report, passing around 380 V, or that kind of high voltage, has already been used in large mainframe systems for, I would say, about seven or eight years.

Ron Vinsant: It's longer than that. The Cyber 70 systems from Control Data in the 1970s used the same overall technique, and Tandem did the same thing back in the 1980s.

Stephen Oliver: I'm just thinking about the IBM Power 6 series; they run on a 350-V dc bus.

Electronic Products: In the data center, though, if you distribute dc there in higher voltages, does that mean that we need some new chips that aren’t necessarily available now that operate directly off of these higher voltages?

Stephen Oliver: No, that's what's interesting; for example, 380 happens to be approximately the output voltage of a PFC stage.

Electronic Products: But I’m thinking about a dc/dc converter, there’s not a lot of 380-V input devices.

Stephen Oliver: Those parts exist today and are being used in production. We have a series of converters that go from 380 down to 12 in a square inch, at 300 W. If you can distribute 380, step that down to, let's say, 48, and then go directly down to 0.8 V, that's the most efficient way to do it, so you carry the 48-V rail.

Electronic Products: But now you have three step-downs that you’re doing, so those efficiencies are all adding up.

Stephen Oliver: Well at the moment you have, let’s say, from 380 V in a traditional system, you usually go from 380 down to 48, then there might be a 48 to 12, then you’ve got a 12 to 1.

So you’re not adding any stages actually all you’re doing is redefining the blocks in the system.

Ron Vinsant: In the telecom world, that's an accurate description. In the computer world, the ac comes in and is converted to the PFC voltage, which is . . .

Electronic Products: Ron, we’re talking about for server farms here?

Ron Vinsant: Well we’re talking about just servers in general.

Electronic Products: Okay.

Ron Vinsant: And even a lot of desktops. So you come into the PFC with whatever your ac is, and you convert to 380 Vdc. The 380 Vdc then goes into some form of a forward converter or an LLC, whatever, and then you get a 12-V output, which is the point-of-load bus, and then the 12 V goes into different forms of converters for your main circuitry.

Stephen Oliver: I would say that's not the most efficient way to do it. It's about 8.7% more efficient if you go from the 380 V to 48 and then down to 1 V than if you go to 12 V and then down to 1 V. Some of that is about the topology, but a lot of it is about the distribution loss at the 12-V point.
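
An illustrative I²R comparison of the two distribution voltages; the load power and distribution resistance are assumptions, and, as the next exchange shows, the real answer hinges on what that resistance actually is.

```python
# Illustrative distribution-loss arithmetic (load power and resistance assumed):
# carrying the same power at 48 V instead of 12 V cuts the bus current by 4x
# and the I^2*R loss by 16x.
P_LOAD = 600.0        # W delivered to the board (assumption)
R_DIST = 0.005        # ohms of backplane/connector resistance (assumption)

for v_bus in (48.0, 12.0):
    i = P_LOAD / v_bus
    loss = i ** 2 * R_DIST
    print(f"{v_bus:4.0f}-V bus: {i:5.1f} A, distribution loss {loss:5.1f} W "
          f"({loss / P_LOAD:.1%} of load power)")
```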

Ron Vinsant: Yes, though on a backplane the 12-V loss is hardly anything in an eight-layer board.

Stephen Oliver: No, it can be quite substantial. We did a presentation at the Supercomputing show in Reno a couple of weeks ago, and I can share that data with people offline if you like.

Electronic Products: Okay, let's go back to what Ron was saying earlier, and maybe I'll ask Eric: are there converter chips available now that work in the 300-V range?

Ron Vinsant: Yes.

Eric Persson: Yes, absolutely, and whether it’s going directly from 380 or 400 down to 48 or down to 12 as Steve said, it’s being done right now, today.

Electronic Products: Okay. Do you see that as a forward trend?

Stephen Oliver: Yes.

Ron Vinsant: Yes.

Electronic Products: Okay, good. And Andrew . . .

Andrew Smith: Yes, I mean 380 Vdc is pretty much what you’re seeing on a rectified high line anyway, so it’s what everyone’s going to see.

Electronic Products: So, what else in computing, any big trends that you see?

Andrew Smith: One of the interesting things is controlling peripherals for PCs. If you look at things like DSL modems, they might be 8 to 15 W, which is not a lot on an individual basis, but there are a lot of modems out there. Generally speaking those devices today are not turned off at any time, they have no standby mode.

So you see things like studies in Australia talking about getting a PC to indicate to its modem that it should shut down. And I think we're going to see more motion in that direction.

Bruno Baylac: There is also another technology that we didn't mention here, and this technology could have some impact in terms of energy efficiency. That is Power over Ethernet, which may be powering a lot of equipment, and optimizing its efficiency may have a significant impact. Of course, the power is limited to 15 W, with high power going to 45 W, so it's a limited opportunity.

Andrew Smith: The development of PoE, which I think is 802.3at, is looking at very carefully describing the amount of power that each peripheral is going to use based on that 48-V bus. And the excitement for the server manufacturers and the folks sending the power down is that they can reduce the size of their power plant, so they don't have to overdesign their power equipment.

And that's quite exciting for someone like Cisco, who has a very large number of potential loads on their PoE systems, but the existing PoE information standard has very poor granularity. So that's going to be a big improvement in the system. And you're right; it is higher power. I think 45 W might be a little bit high; I've seen loads quoted at about 25 to 30 W.

Ron Vinsant: The other thing PoE has done, since there is a power constraint on what you can get out of a PoE port, is force the design engineer to very carefully think out his power-management strategy on those products.

Andrew Smith: That’s a great point.

Ron Vinsant: The other thing I would like to point out is that there are some things coming along in the dc/dc realm. Fairchild, among others, is starting to integrate the controller and the MOSFET altogether. But it's not a monolithic process; we use the best of our MOSFET processes in combination with the best of our controller processes.

A lot of parasitics are reduced, the layouts are simplified considerably for the design engineer, and the overall efficiency of the dc/dc converter is raised by a few percentage points.

Eric Persson: Yes, I agree that’s the trend and I think our vision looking into the future is not only power switches and drivers and control, but, as frequency goes up an order of magnitude or two, you’ll see the capacitors integrated as well, so that you end up with the entire product on a single chip. And, that can be mounted right underneath the processor.

Electronic Products: Are we talking about a half bridge here?

Ron Vinsant: No, it's generally synchronous buck conversion right now.

Stephen Oliver: But there's also a thing called a Sine Amplitude Converter, which is a full-bridge, resonant, synchronous-rectified solution that goes right at the load itself.

It also enables the removal of bulk capacitors from the load.

Electronic Products: And does that work as well down at low loads?

Stephen Oliver: Yes, it’s very very flat from about 15% up to about 100% load.

Eric Persson: A half bridge is fundamentally a high-side switch and a low-side switch, so I kind of consider the synchronous buck to be like a half bridge; it's not operating resonant, it's operating where you have zero-voltage switching on one side, and current flows in one direction.

Stephen Oliver: The problem with the synch buck, though, is that you try to keep the distribution level at as high a voltage as you can to minimize I²R losses, but as processor voltages dip below a volt, the duty cycle of the synch buck becomes very, very difficult to do. So there's a topology limitation there.
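
The duty-cycle limitation Stephen describes falls straight out of the ideal buck relation D = Vout/Vin; the operating points and switching frequency below are hypothetical.

```python
# Illustrative duty-cycle arithmetic for an ideal synchronous buck.
examples = [
    # (Vin, Vout, switching frequency in Hz) - assumed operating points
    (12.0, 1.0, 1e6),
    (12.0, 0.8, 1e6),
    (48.0, 0.8, 1e6),
]

for vin, vout, fsw in examples:
    duty = vout / vin
    t_on = duty / fsw
    print(f"{vin:4.0f} V -> {vout:3.1f} V: duty {duty:6.1%}, "
          f"on-time {t_on * 1e9:5.1f} ns at {fsw / 1e6:.0f} MHz")
```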

Ron Vinsant: We've sort of overcome that. I'm sort of jumping the gun here, but let's just say you'll see much, much higher-frequency converters in the future.

Electronic Products: What frequencies?

Ron Vinsant: I would predict, let’s say, tens of megahertz.

Electronic Products: Everyone agree with that?

Eric Persson: Yes, I agree, I think even 50, 60 MHz.

Electronic Products: And the reason is, that with this integration you’re talking about you’re able to control the parasitics so much better?

Ron Vinsant: It is what Eric said before. Increasingly our customers are not schooled in power design, and it’s much better if the inductor, the capacitors, and the power switches are all in the same box. There’s modularization, but in a way that you haven’t seen in the past.

Eric Persson: It's improvements in switch technology, and just what Stephen was saying earlier about the duty cycles. Trying to go from 48 V down to 0.6 V, for example, the duty cycles are incredibly small. So on the control FET, the losses are completely dominated by the switching loss.

And on the synch FET it's all static conduction loss, and it's pretty difficult to get high efficiency, because your output is only 0.6 V to begin with and now you have to get your synchronous-switch conduction losses down to the millivolt level. So it's different switch technologies and different driver technologies, along with integration of those into a single package to eliminate parasitics.
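
A rough loss-split sketch of Eric's point about the control FET versus the synch FET; all device parameters and the operating point are assumptions, not vendor data.

```python
# Illustrative loss split for a high-ratio synchronous buck (all figures assumed):
# the control FET is dominated by switching loss, the sync FET by conduction loss.
VIN, VOUT, IOUT, FSW = 12.0, 0.6, 20.0, 1e6
D = VOUT / VIN                     # 5% duty cycle

RDS_CTRL, RDS_SYNC = 8e-3, 2e-3    # hypothetical on-resistances, ohms
T_SW = 10e-9                       # assumed voltage/current overlap per transition, s

p_sw_ctrl   = 0.5 * VIN * IOUT * T_SW * FSW     # switching loss, control FET
p_cond_ctrl = D * IOUT ** 2 * RDS_CTRL          # conduction loss, control FET
p_cond_sync = (1 - D) * IOUT ** 2 * RDS_SYNC    # conduction loss, sync FET

for name, p in [("control FET switching ", p_sw_ctrl),
                ("control FET conduction", p_cond_ctrl),
                ("sync FET conduction   ", p_cond_sync)]:
    print(f"{name}: {p:5.2f} W")
print(f"output power           : {VOUT * IOUT:5.2f} W")
```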

Electronic Products: And that’s driving higher frequencies?

Eric Persson: Yes.

Ron Vinsant: And one other thing: do you know why we are talking about 0.6 V? Because the 45-nm processors are coming, and their Vmax is 0.9 V and their suggested operating voltage is 0.6.

Eric Persson: Back 15, 20 years ago, the MOSFET was the ubiquitous power switch. Now you go onto any of our Web sites and you will see thousands of FETs to choose from and different drivers, and increasingly our customers aren’t necessarily experts. They’re expecting us to provide them a solution that meets their particular application requirements, to do their homework for them, provide them with the right driver, the right switch for their application.

So this whole idea of packaging the appropriate switches and drivers and really having that become the single part is the future. It will be a selection of those, rather than the individual FETs.

Andrew Smith: I think you’re absolutely right, the ability of engineers to design power supplies is probably not what it was 20 years ago, so you’ll see a continuing trend with the manufacturers of the components providing more and more of that function as we move forward.

Stephen Oliver: Obviously, there's some new semiconductor material work out there that enables these kinds of high frequencies, and over time we've gone from making things work at 50 Hz, to kHz, to MHz, so obviously there's a trend toward that.

Definitely, there's the power-component paradigm, like the Lego brick. You go back to, say, '84, with the introduction of the first full 48-V-input dc/dc brick by Vicor as a power component, which meant that engineers didn't have to design that particular chunk.

Electronic Products: So the trend is high frequency, high-voltage input and low-voltage output?

Stephen Oliver: Yes.

Electronic Products: The ICs that are taking this 300-V input, are they FETs or . . .?

Stephen Oliver: They are, if it's a traditional topology like a forward converter or a half bridge. The control IC tends to be low voltage, controlling high-voltage FETs through various buffers and transformers.

If it’s a VI Chip technology, that’s a multichip integrated module, again the same philosophy though, low-voltage control ICs driving high-voltage switches.

Electronic Products: And those switches are MOSFETs rather than IGBTs?

Stephen Oliver: Yes, basically when you talk about these frequencies, it’s going to be a MOSFET. However, we have to be very careful when we talk about a MOSFET, when we go to new technologies, because they are different.

Eric Persson: I think Stephen hit it right on the head. We can make either low-voltage or high-voltage ICs and really it gets down to which of them is best suited for the particular topology we’re making. On the switch side, at the frequencies we are talking about, it’s not going to be IGBTs.

Really, IGBTs are great for motor drives and for some induction heating and resonant kinds of applications, even up to maybe 100 kHz. They're much more typically used in the 20 to 40-kHz range, just above the audible.

Stephen Oliver: To comment in terms of power level, when it comes to the lower power levels, and maybe I'm speaking for Andrew here, the Power Integrations technology is a monolithic high-voltage unit.

Andrew Smith: That's correct. Actually it's a combination of high voltage and low voltage in a monolithic process. So it depends on the timing approach and what you want to do.

Ron Vinsant: I agree with Eric. We have 600-V HVIC processes, that is, high-voltage integrated circuits, and we do things completely on that 600-V process, including control functions.

Andrew Smith: So different ways of cutting that cake.

Ron Vinsant: Very often it is better to use, for example, external FETs, because the FET process is preferable to the HVIC process for doing high-current switching. And in some instances a low-voltage IC process is better, for example for doing drivers.

Eric Persson: Yes. Both our companies have products where, rather than going with a monolithic approach, which seems to have some advantage in being a single die and all that, when you make FETs using an IC process you end up with a lateral FET, which takes up a lot of acreage to get the same job done.

So the cost tradeoff becomes more dubious and, in fact, it turns out you're probably better off choosing an IC process to develop the controller and driver, choosing an optimized MOSFET process, and then co-packaging all that into a low-cost package.

Andrew Smith: Yes, those are certainly some good comments. I think the other side of that, of course, is that with a monolithic design you have more access to different points on the silicon. So there are some things you can do with some of the control functions that are easier and more comprehensive if you use monolithic approaches.

What it comes down to is, depending on power level and depending on the cost target of your customers, there are different ways of doing that technology.

Eric Persson: I’m not trying to bash Power Integrations, because you’re really covering a little bit different market, and for high voltage, I think what you’re doing makes a lot of sense.

I was talking about IR and Fairchild with the lower-voltage devices.

Electronic Products: So what about the rectifier? Mostly we're doing synchronous rectification. What about silicon-carbide rectifiers; are you guys using those at all?

Ron Vinsant: Silicon carbide has both strengths and weaknesses. The main thing about silicon carbide is the EMI reduction that you get, the EMI footprint, especially in power factor correction. The high forward-voltage drop hasn't been the issue so much as the surge-current capability. They're relatively weak in that sense, compared to conventional ultrafast diodes.

Electronic Products: So somewhat delicate?

Eric Persson: Yes. The performance is outstanding. They're really nice parts, especially for hard-switched high-voltage applications. You get this Schottky diode with no reverse recovery.

But I just don't see how they're ever going to get the cost down to anything we consider reasonable. The wafer costs alone are several orders of magnitude higher than conventional silicon wafers.

Ron Vinsant: They can help efficiency too, and one thing I'd like to point out is that if you compare a high-voltage ultrafast diode to conventional diodes and to silicon carbide, the costs aren't comparable, that much is true. But if you carefully look at your design and look at the reduction in the EMI filter that you can get, I think you could make a reasonable argument that the cost is a wash.

Eric Persson: Yes, that's a good point, Ron. As to the future, it depends on Cree; as the major player, they dominate the world in making silicon-carbide wafers, and they sell their wafers to other vendors. But I believe down the road there will be other switch materials that will allow for high-voltage synchronous rectification, so it isn't really even a diode; it will be just like doing a synch buck, only now you can do it at 600 V.

Electronic Products: Yes, and Andrew, the high-voltage FETs that we're talking about: if we're working off the 370-V bus, do they still have pretty good on-resistance?

Andrew Smith: Any kind of lateral technology is not going to be quite as efficient as a trench process for high voltage, but the higher the voltage the less likely you are to use trench, I suppose.

It depends on the power levels, because where Power Integrations is in the segment, we're looking at lower power, which means it's more of a balance between switching and conduction losses. As far as MOSFET on-resistance goes, in some cases if you make the MOSFET too big you'll lose efficiency, because you're losing more to switching transition losses. So how dominant on-resistance becomes really depends on your power level.
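
Andrew's "too big a MOSFET" tradeoff can be sketched as a simple sweep; the scaling rules and every number below are assumptions for illustration only.

```python
# Illustrative FET-sizing sweep (all figures assumed): scaling the die up lowers
# Rds(on) (less conduction loss) but slows the transitions (more switching loss),
# so the total loss has a minimum at some die size.
VIN, IOUT, FSW = 380.0, 1.0, 100e3      # hypothetical operating point
RDS_BASE, TSW_BASE = 0.5, 20e-9         # assumed values for a "1x" die

def total_loss(die_scale):
    rds = RDS_BASE / die_scale          # bigger die -> lower Rds(on)
    t_sw = TSW_BASE * die_scale         # bigger die -> slower transitions
    p_cond = IOUT ** 2 * rds * 0.5      # assume ~50% conduction duty
    p_sw = 0.5 * VIN * IOUT * t_sw * FSW
    return p_cond + p_sw

for scale in (0.25, 0.5, 1, 2, 4, 8):
    print(f"die scale {scale:4}x: total loss {total_loss(scale):.2f} W")

best_loss, best_scale = min((total_loss(s), s) for s in (0.25, 0.5, 1, 2, 4, 8))
print(f"lowest loss {best_loss:.2f} W at die scale {best_scale}x")
```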

Ron Vinsant: Generally, the reverse-recovery losses are so dominant in the diode that the forward-voltage losses, the conduction losses, kind of don't matter that much.

Electronic Products: Especially higher frequencies?

Ron Vinsant: Especially at higher frequencies, exactly. Now on the other side, on the MOSFET, there are vertical processes. We have our SupreMOS, and there's CoolMOS from Infineon, and you can get 600-V devices that are sub-100 milliohm, but the capacitance is high.

Eric Persson: It's what's called superjunction, or CoolMOS, and there are various other brand names, but those processes allow you to get a very low RDS(on) in a small area, compared to a traditional FET process.

But the other penalty is just what Ron was talking about: the diode reverse recovery of that process is very poor, so they are well suited for things like PFC, or for zero-voltage switching, like in lighting ballasts, for example, but not for hard switching, as in inverters.

Electronic Products: So going back to the top level here, we're talking about saving energy, but we're already at 95% efficiency, so there is not a lot of room to make great strides. I just mean in general: we have power converters now that are 95% efficient, so we're not going to get much better than that, right?

Ron Vinsant: Okay, let me put it this way. I was at an Intel Developer Forum, and one of their pathfinder groups was giving a presentation on why efficiency was becoming so important.

I asked about the fact that, historically, the only thing people were willing to pay for was dollars per watt. And was I now being told that people would pay for efficiency points per dollar? And I was told yes. People are now willing to pay good dollars for efficiency.

Eric Persson: To them, going from a 75% or 80% efficient converter topology to something where they can get into the 90% range is a huge savings.

Stephen Oliver: However, they are schizophrenic. You talk to the development guy and he said yes, I will pay for efficiency. You talk to the purchasing guy, and he denies it exists. ■
