Boost vs RPM vs Voltage vs Current

AlexLTDLX

With the help of you all and outside sources, I'm finally starting to make sense of my first drag test datalogs. And it's not necessarily working how one would logically think. Typically, you'd expect that more voltage would also require more current and more power to get more RPM (and therefore boost). But it seems that less voltage is requiring more current and wattage to produce LESS RPM and boost.

Let me explain. On our second pass we had the "happy accident" of leaving hard enough off the transbrake that one of my packs disconnected itself, basically leaving us with one 16s pack instead of two 16s packs in parallel. That gave us two sets of data points to work from. Now, I realize that the second pass was much quicker (ET-wise) but not much different speed-wise (MPH); that's mostly due to leaving on less boost on the first pass and more wheelspin.

First, let's look at the EFI datalogs so you can see what I mean.

First pass (2x 16s packs in parallel):
1stPassBoosteddatalog.jpg

The white trace is engine RPM, the red trace is boost and the green trace is throttle position. You can see the rather large amount of wheelspin at the top of first gear and at the bottom of second (which, with a Powerglide, is more like 2nd or 3rd gear in a "normal" car; and on slicks, no less) - if there were no wheelspin, that trace would be smooth up to the shift. That accounts for the loss in ET, but shows we're making a good bit of power. But you can see that peak boost is 6 psi and average boost is 4.765, even though we left the line at only 2.4 psi.

Now let's look at the second datalog:
2ndPassBoosteddatalog.jpg

Here you see much less wheelspin in 1st gear, and less boost - a peak of 5.9 and an average of 4.514 - even though we left the line at almost peak boost (it was probably closer to 4 psi, but still much more than the 1st pass). Again, that gave us almost a HALF SECOND improvement in ET, even though we were actually making less power. This is the pass where we had a battery pack become disconnected on the launch.

Now on to the interesting bit... I'll put it in the following post just to keep things manageable.
 
So, based on the graphs above, I'd think that the first pass saw more voltage (less sag due to having 2x 16s packs in parallel), and should've seen more current and wattage. But that's not the case, according to the ESC datalogs.

Let's first compare the eRPM (and the actual impeller RPM of the compressor). To get impeller RPM, simply divide eRPM by the number of pole pairs in the motor. In this case, it's a 6-pole motor, so there are 3 pole pairs - divide eRPM by 3 to get impeller RPM.
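For anyone following along at home, the conversion described above is trivial to script - a quick sketch, where the pole count is the only input that matters:

```python
# Convert ESC-reported eRPM to actual impeller RPM, as described above:
# a 6-pole motor has 3 pole pairs, and impeller RPM = eRPM / pole pairs.
def impeller_rpm(erpm: float, poles: int = 6) -> float:
    pole_pairs = poles // 2
    return erpm / pole_pairs

print(impeller_rpm(80_000))  # 80,000 eRPM on a 6-pole motor -> ~26,667 RPM
```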

1st pass:
Screenshot (49).png

2nd pass:
Screenshot (59).png

You can see that the impeller was spinning faster throughout the run in the first pass; in fact, a lot more of the second pass was under 80,000 eRPM - 80,000 eRPM being 26,667 impeller RPM. Which makes sense: when you look on the right-hand side of the datalog screenshots under "Minimums," you can see the first pass had a voltage minimum of 50.6 volts and the second pass (with the disconnected pack) had a voltage minimum of 48.31 volts. So far, it all makes sense, right?

Here's where it gets a bit odd. Look under the "Maximums" area on each datalog - the 2nd pass, where we saw less boost and impeller speed, was actually pulling more current - A LOT more current - 285.72 amps vs 241.09 amps for the first pass. And wattage follows suit - 13,802 watts for the second, lower-boost pass vs 13,157 for the first, higher-boost pass.

So now the question comes, why is that? I'd think the first pass would've pulled more current and therefore wattage as well.

Let's look at the motor duty cycle for the first pass vs second pass.
First pass:
Screenshot (52).png

Second pass:
Screenshot (61).png

At first glance they seem pretty close, but if you look carefully, you'll notice the duty cycle on the second pass is, on average, higher than on the first pass.

Now here's where I'm starting to guess, and other theories/inputs are appreciated. It seems that the closer we get to 100% duty cycle, the more power the motor will draw - it's getting less efficient. And the conclusion I'm drawing is that more voltage buys more "headroom" as you close in on 100% duty cycle. SO... I'm going to run with the notion that even more voltage (which I'm pretty sure my new battery pack will provide) will give us even more impeller RPM, and therefore boost and power, all while drawing less current. Of course, all of this is "up to a point," but we don't know what that point is yet. We'll probably find out soon enough.
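To make the "headroom" idea concrete, here's a rough back-of-envelope model. The Kv, load current, and winding resistance below are made-up illustrative numbers (I don't have the real motor constants), chosen so ~26,700 RPM lands near full duty at ~50 V - the point is only that the required duty fraction falls as pack voltage rises:

```python
# Hedged sketch: the effective voltage the ESC must deliver to hold a
# given RPM is roughly back-EMF plus the resistive drop under load;
# duty cycle is that voltage as a fraction of pack voltage.
# kv, i_load, and r_motor are ILLUSTRATIVE guesses, not measured values.
def required_duty(target_rpm, pack_v, kv=560, i_load=250, r_motor=0.008):
    back_emf = target_rpm / kv              # volts needed to hold this RPM
    v_needed = back_emf + i_load * r_motor  # plus IR drop under load
    return v_needed / pack_v                # fraction of pack voltage

for v in (48.3, 50.6, 66.0):  # the two logged sag minimums, plus a bigger pack
    print(f"{v:5.1f} V pack -> duty {required_duty(26_667, v):.0%}")
```

With these placeholder constants, the sagged single pack can't reach the target RPM at all (duty comes out over 100%), the dual pack is right at the ragged edge, and a higher-voltage pack has real margin - which is the headroom argument in a nutshell.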

Thoughts?
 
I don't think the runs are comparable, because the load on the e-turbo is not the same and therefore the results are not directly comparable. I'm most interested in the RPM, which differs greatly between the two runs. This affects the back pressure applied (LOAD) to the e-turbo, so it's not apples to apples.
EDIT: A less restrictive compressor outlet means more load on the compressor and electric motor. Higher engine RPM means more flow, and therefore a higher electric-motor load.

You'd expect that an increase in impeller RPM would require more POWER. That is largely a true statement, BUT it's only true if the load on the blower is the same. If it's changed, then you've changed a fundamental parameter and the results are not comparable.

Example: take a vacuum cleaner and block it. You'll find that RPM increases while vacuum increases; however, POWER will REDUCE, not increase. All of this happens while applying the same input voltage.

As a motor increases in RPM, so does back-EMF, and thus current is reduced, because the back-EMF is countering the battery's input voltage. So max power is not based on pack voltage; it's based on pack voltage minus back-EMF. This is why motors pull HUGE amps (and therefore power) at 0 RPM, while at max theoretical RPM they pull hardly anything - because back-EMF has almost reached the same voltage as pack voltage, so current and power are now tiny.
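That relationship can be sketched in a couple of lines. The Kv and winding resistance below are placeholder values, not this motor's actual constants; the shape of the curve is the point:

```python
# Idealized DC-motor current model for the point above: current is driven
# by the difference between pack voltage and back-EMF, divided by winding
# resistance. kv and r are placeholders, not measured for this motor.
def motor_current(pack_v, rpm, kv=560, r=0.008):
    back_emf = rpm / kv
    return max((pack_v - back_emf) / r, 0.0)

print(motor_current(50.0, 0))       # stall: huge theoretical current
print(motor_current(50.0, 27_900))  # near no-load RPM: almost nothing
```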

Now, why does the RPM increase? I believe at some point the impeller is in a stalled state and the load has actually decreased, allowing for a greater RPM despite the reduced power.

If your test above were a bench test with a static backpressure/flow rate applied, then I'm sure your results would be as expected.
 
Ahhh... but I really don't think the load on the motor was much different - it was making almost the same boost (within 2%), yet power changed (the opposite way) by over 5%. And this is with RPM and power going in OPPOSITE directions (more RPM/boost took less power). I don't see any way that more RPM and boost could result in less load, while engine HP stayed at least the same, if not a little higher, on the first pass. If you can think of how that would be, then I'd agree with you. That's not a challenge or anything; I'm just trying to figure this out. I suppose you could say that the wheelspin meant less load, but then the engine would've accelerated faster, requiring more air and giving less boost. But the exact opposite happened.

The general consensus from Steve Neu and Thomas over at Castle, as well as the engineers at APD, is that the ESC is throttling for some reason (kind of obvious, but it's throttling more on the first pass - where it actually made more power, so the load on the motor was actually HIGHER - thus the increase in boost). I still don't have a clear picture why, but I'm guessing it's voltage related.

There's only one way to find out for sure, and that's my next round of testing. BUT, if it's true, we can get close to 750 hp at about 15 kW (just prognosticating, based on the data) - which still means there's plenty more headroom in the ESC, but the motor will be tapped out. If by some freak of nature we don't hit 9's and the TP Power motor is tapped out, I'll be hitting up Steve for one of his motors. Somehow, though, I don't think that'll happen. But again, solid 9's are the limit for this car - I technically can't race it that fast; I'd need a full roll cage. Once that point is hit, it'll be time to make a BIG eTurbo and go for really big power.

But right now, until we get it on the dyno or back to the track, it's really all theory anyway. Who knows? It's possible that if we strike the right balance of voltage/duty cycle/current with this unit, we might actually exceed the expected power.
 
Yes... it certainly is an "optimization" problem... with too many parameters.

First we need to fully understand "the model," then start tweaking one or two parameters to ultimately get maximized boost (over the engine RPM range / consumed CFM!) at the lowest possible input energy (in kWh).

Alex... you have done all these datalogs with the "old ESC" in place... and the previous poster has a point: the LOAD on the impeller needs to be comparable over time too!

So let's take a break and think about whether we'll really get the right answers from a static pressure test. I would say we need to simulate the engine's air flow/consumption in a controlled environment and then watch the development of boost while playing with the electrics... 🤔
 
That's the problem - static pressure testing is almost impossible for us to simulate in a working engine. Dyno testing is probably the closest we can get, and I'm trying to figure out how we can do multiple tests (different voltages) in the same dyno session. I can charge these cells really fast (15C rate), so maybe we start at, say, 66 volts and do a pull, then 68 volts and do a pull, then 70 volts and do a pull.

The problem is it would take a monster supply to do it quickly; and even then, it'll slow down as we close in on the target voltage (the crossover point from constant current to constant voltage). Once I finish these packs (hopefully in the next few days), I'll charge them to 66 volts to get a feel for how long it takes. I may have to just do two - 68 volts and 70 volts - because I have another trick up my sleeve for more power that I want to test on the dyno too. But I can't tie up Ray's dyno for a whole day. I could maybe get away with half a day.
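For a rough feel of between-pull charge times, here's a simple constant-current estimate. The pack capacity and the Ah burned per pull are hypothetical placeholders, and the constant-voltage tail near the target voltage is ignored, so real times will run longer:

```python
# Estimate CC-phase recharge time at a given C-rate. The CV taper near
# the target voltage is ignored, so treat this as a lower bound.
# Capacity and Ah-to-restore below are hypothetical example numbers.
def cc_charge_minutes(ah_to_restore, capacity_ah, c_rate=15):
    amps = c_rate * capacity_ah      # charge current at this C-rate
    return ah_to_restore / amps * 60

# Hypothetical: a 5 Ah pack that used 1 Ah on a pull, recharged at 15C:
print(cc_charge_minutes(1.0, 5.0))  # -> 0.8 minutes, plus the CV tail
```

The takeaway is that at a genuine 15C the bulk of the recharge is under a minute; it's the CV crossover near the target voltage that eats the time, as noted above.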

BTW - these data logs aren't from my old ESC - those chineseum pieces of garbage don't data log. So that part at least is apples to apples.
 
There is also the situation where a warm LiPo can deliver more current than a cold one, so that might be playing a factor here. Without a dyno and a lot of testing, who's to know? Personally, I wouldn't worry about it... Alternatively, if you could pull off some bench testing, we would ALL REALLY ENJOY it.
 
oops ... I have not looked at the APD software window titles 🙈
 
Yeah, I had initially chalked it up to warm LiPos, but there's too much that doesn't make sense with that as the only cause - like the motor pulse-width logs. It may still be a factor, though. LTO cells don't have that effect; and certainly not with as much continuous-draw capability as we now have. The smallest cable in our power path is now 4 gauge, for probably 6 feet; everything else is now 2 gauge or bigger. Before, the biggest cable was 4 gauge, and the smallest was 10 gauge (from the batteries to the EC5 connectors). Basically, this current power setup should be overkill for anything I'm going to run on this car. If I had done that from the start, I probably would've saved well over a thousand dollars. But when we started, we had no idea if it would work at all or what the power requirements would really be.
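The cable upgrade is easy to sanity-check against standard copper AWG resistance tables. The 6 ft run and the ~285 A peak are figures from this thread; the per-foot resistances are standard published values for copper:

```python
# Voltage drop across a cable run: V = I * R, using typical copper
# resistance per foot from standard AWG tables.
OHMS_PER_FOOT = {10: 0.000999, 4: 0.000249, 2: 0.000156}

def v_drop(awg, feet, amps):
    return amps * OHMS_PER_FOOT[awg] * feet

peak_amps = 285.72  # peak current logged on the second pass
print(f"10 AWG, 6 ft: {v_drop(10, 6, peak_amps):.2f} V")  # old smallest cable
print(f" 4 AWG, 6 ft: {v_drop(4, 6, peak_amps):.2f} V")   # new smallest cable
```

At these currents the old 10 AWG section alone was giving up well over a volt - a meaningful chunk of sag when the whole pack minimum was ~48-50 V.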

If I were smart, I would start selling off the old power setup. I bought a lot of LiPo chargers I now won't be using, unless I repurpose them for something else.
 