• Status Closed
  • Percent Complete
  • Task Type Patches
  • Category Operating System/Drivers
  • Assigned To No-one
  • Operating System Another
  • Severity Low
  • Priority Very Low
  • Reported Version Version 3.2
  • Due in Version Undecided
  • Due Date Undecided
  • Votes
  • Private
Attached to Project: Rockbox
Opened by FlynDice - 2009-06-17
Last edited by FlynDice - 2009-06-30

FS#10344 - AMSSansa Dynamically adjust core voltage to extend playtime

This patch reduces the core voltage (CVDDp) setting from the default 1.2 V to 1.05 V when the processor is running at less than 200 MHz. There are several notes in the as3525 datasheet that address lowering the voltage to 1.1 V when running below 200 MHz, and we spend the majority of our processor time down in the 65 MHz range (at least with music), so I thought I would try that and see how it worked; I have had no problems yet. If you select a maximum processor speed above 200 MHz, the adjustment happens during the boost/unboost function. If you select a maximum processor speed below 200 MHz, it simply makes the adjustment during system_init and leaves it there. I also added an item on the View HW info page: right beside the mmu indication there is a CVDDp indication that shows "high" or "low". I ran a battery bench on this with the current svn default of 248/62/62 and got just under 18 hours on an e280v2. Bench and graph attached also.
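The boost-time decision described above can be sketched roughly as follows. This is a minimal illustration only: the threshold constant and function name are inventions for this sketch, not the actual Rockbox identifiers from the patch.

```c
#include <stdbool.h>

/* Threshold taken from the as3525 datasheet note discussed in this
 * task: above 200 MHz the core must use the 1.20 V setting.
 * CVDD_HIGH_THRESHOLD_HZ and cvdd_high_required are made-up names. */
#define CVDD_HIGH_THRESHOLD_HZ (200 * 1000000UL)

/* Returns true if the high (1.20 V) CVDDp setting is required for the
 * requested fclk. The quoted note only says "below 200 MHz" may use a
 * lower setting, so exactly 200 MHz is treated here as not requiring
 * the high setting; a conservative implementation might do otherwise. */
static bool cvdd_high_required(unsigned long fclk_hz)
{
    return fclk_hz > CVDD_HIGH_THRESHOLD_HZ;
}
```

With the svn default clocks from the description, 248 MHz would need the high setting and 62 MHz would not.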

Closed by  FlynDice
2009-06-30 17:59
Reason for closing:  Accepted
Additional comments about closing:


Have you measured the time needed for switching the voltage?

You could read TIMER2_VALUE before and after and take the difference (the timer decrements 1,500,000 times per second).
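The arithmetic behind that suggestion: at 1,500,000 decrements per second each tick is 2/3 µs, so converting a measured delta to microseconds is just a division by 1.5. A sketch (the helper name is made up for illustration):

```c
/* Convert a TIMER2_VALUE delta (the timer counts DOWN at 1.5 MHz, so
 * delta = value_before - value_after) into microseconds, truncated.
 * Wraparound is ignored in this sketch. */
static unsigned long timer2_delta_to_us(unsigned long ticks)
{
    return ticks * 2 / 3;  /* ticks / 1.5 */
}
```

A delta of 635 ticks comes out to ~423 µs, matching the measurement reported later in this thread.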

The only thing I measured was runtime... ;) I'm not sure I understand why we need to know the time it takes to switch the voltage. Is this a software or hardware concern?

Well, CPU boosting is needed for timing-critical code, so we must not lose any performance.

Setting the bit is probably fast enough.

BUT I'm not sure how long it takes for the voltage to actually settle. Boosting without waiting for the voltage to reach its maximum might be dangerous.

Can we even find this anywhere without taking one apart and physically measuring the voltage with a meter? I can’t find anywhere that we can read the actual voltage to make the measurement and see how long it takes to come up, we just have the ability to set this value. It would seem that the concern being raised here is that we should delay for the voltage to come up to the full 1.2V before boosting and that this delay would impact performance. There should not be any issues with lowering the voltage for unboosting. Am I understanding this correctly?

I cannot offer any better claim besides the fact that this patch has handled everything I have tried it with including mpegplayer and timestretch without hiccupping at all.

Changing the voltage takes ~360 µs (measured as an ascodec_read + ascodec_write), so I think it's fast enough.

kugel: if the voltage is not changed rapidly enough, would running at the new frequency be incorrect?
saratoga said he could do measurements on his e200v2 if needed: I think it is the CVDD pin, which is bottom right on the BGA.

Also, the datasheet says that below 200MHz VDD_CORE can be at 1.10V, but we set it to 1.05V (1.10V is for "typical operation").

Note that page 119 has the power-up timings; they mention 10ms for "charge pump (50mA) enable unlock, enable via I2C setting", but I'm not sure whether that applies to changes when the system is already up.

I can't actually get at the BGA pins though; they're under the chip and only a few hundred microns wide. There's also no trace on CVDD on the e200v2.

Regarding CVDD, "7.4.20 10-Bit ADC" says that the ADC can measure CVDD (source 0x0100). Perhaps it would be enough to start the voltage change and then immediately read CVDD?
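The thread below implies the 10-bit ADC result is split across two registers, with the low 8 bits in ADC_1 (enough on their own to tell 1.05 V from 1.20 V) and the remaining high bits in ADC_0. Assuming that split, combining them could look like this sketch; the exact bit layout and the function name are assumptions, not taken from the datasheet:

```c
#include <stdint.h>

/* Assemble a 10-bit ADC reading from the two codec registers,
 * assuming ADC_0 carries bits 9:8 in its low two bits and ADC_1
 * carries bits 7:0. adc_combine is a made-up helper name. */
static uint16_t adc_combine(uint8_t adc_0, uint8_t adc_1)
{
    return (uint16_t)(((adc_0 & 0x03) << 8) | adc_1);
}
```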

Ok, I think I've got a reading on this using this function: I get a delta of 635 on TIMER2_VALUE to go from 1.05v up to 1.20v, which I believe comes out to ~423 µs. Would someone please verify the method and result?

I have verified that I am reading CVDD and constructing the value correctly by displaying it and observing the expected changes while operating. I wonder if the ascodec_read function significantly impacts the value of the difference though.

When I looked at my test function again I realized that I don't actually need to read ADC_0 to check on the voltage change; the bits in ADC_1 are sufficient. So I used this function: to find a delta of 317, about half of the previous delta with half the ascodec_reads. This would point to a time of ~212 µs, but I think what is really being timed here is the ascodec_read function and not the voltage change itself. Even so, I believe this does tell us that the voltage change takes ≤ 212 µs.

That seems short enough to get away with a little delay.

So, can this be committed, or is there anything to clear up first?

A thought while talking with saratoga: can't we block until the voltage is correct?

I think it takes longer to read the voltage than it does for the voltage to change though. So instead of blocking until the voltage is correct wouldn’t it be better to go with just a small delay before boosting the frequency?

I think we'd better play it safe. Also, 300 µs seems like nothing to worry about.

How much time exactly does it take to read the voltage?

Blocking would be 100% safe, waiting with a small delay could break anytime so I’m definitely for blocking as well.
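The "block until the voltage is correct" idea can be sketched with a bounded polling loop. To keep the logic testable off-target, this sketch polls a simulated codec that "ramps" over a few reads; every identifier here (the sim_* stubs, wait_cvdd_high, the 0xE0/0xC0 values) is invented for illustration and is not real Rockbox code:

```c
#include <stdbool.h>
#include <stdint.h>

/* Simulated codec: the first few reads return a low value, then the
 * reading crosses the threshold, mimicking the ~212 µs settle time
 * discussed above. These stubs stand in for ascodec_read(AS3514_ADC_1). */
static unsigned sim_reads_until_high = 3;

static uint8_t as3514_sim_read_adc_1(void)
{
    if (sim_reads_until_high > 0) {
        sim_reads_until_high--;
        return 0x00;   /* still at the low voltage */
    }
    return 0xE0;       /* arbitrary value representing >= 1.20 V */
}

/* Spin until the ADC low byte says CVDD has reached the high setting.
 * A bounded retry count avoids hanging forever on a hardware fault. */
static bool wait_cvdd_high(uint8_t threshold, unsigned max_reads)
{
    while (max_reads--) {
        if (as3514_sim_read_adc_1() >= threshold)
            return true;
    }
    return false;
}
```

Since one ADC read already takes about as long as the voltage change itself, in practice the loop would typically succeed on the first or second read.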

Just asking about the 1.10V vs 1.05V thing.

The datasheet isn't clear about that. I find a mention of 1.10V (page 12), a mention of 1.08V (page 12) and a mention of 1.05V (pages 181 and 188).

It definitely says the CVDD can be programmed for 1.05V (page 181), so I think we should go with that one. Maybe the hardware doesn’t exactly hit the 1.05V. It actually seems to me that both 1.05V and 1.10V result in 1.08V.

Page 188 is for the 144-pin BGA version, but I think we use the version with 224 pins (judging from pictures of disassembled models).

Page 180 (of datasheet v1.13) I see “K1 vdd_core P P 1.2V core power supply”

Page 12 quotes:
"For normal operation with fclk (CPU ARM-922T clock) frequencies below 200 MHz, CVDD (supply of VDD_CORE) can be set to a lower value of 1.10 V. Only for setting fclk of the CPU to clock frequencies above 200 MHz, the VDD_CORE supply voltage must be set to 1.20 V typical conditions."

So to respect the requirements we should not use a voltage lower than 1.10V.

Page 181 is for the 224-pin version (w/ external memory), and it says the CVDD can be programmed for 1.05V. I.e. we're not doing anything wrong. It's probably another question what voltage actually results (I'd guess 1.08V, as page 12 says).

Ah I see it now (pin R15).

You can see on page 13 the minimum/typical/maximum voltage for each setting; I believe the requirement refers to the "1.10V" setting, not the actual exact output voltage.

Alright, the note on page 13 is pretty explicit (too bad it's so hidden): the settings which allow for <1.10V must not be used, as the as3525 isn't specified for that.

So, yes, we should stick to 1.10V (although I don't think it would be a problem; too low a voltage generally isn't harmful).

It even says the 1.10V setting must not be used, and that we must set 1.15V to get a minimum of 1.10V.

FlynDice, could you measure battery life again with the 1.15V setting, since its maximum voltage is 1.30V (higher than 1.24V)?

Where does it say that?

The CVDDP description page 13 says “see Note (1)” for both 1.05 and 1.10 settings.

Note (1) is the one you mentioned at 15:38 GMT+1

Alright, can we see what the OF sets CVDD to?

Wow, this is heating up pretty quickly... I've read all the references being tossed around here (before today, even) and I think the one that really applies is the one on page 12. The way I read it, it explicitly states that at frequencies below 200 MHz you can and should operate at the lower 1.10 V setting for power-consumption reasons. Along with the "respecting the rules" perspective, this also matches what is expected when operating a processor at a higher frequency: a higher frequency often requires a higher voltage for stability, or at least for a guarantee of stability. I read this note as saying that if you are going to run at over 200 MHz you need to raise the voltage to 1.20 V for stability, but if you are running below 200 MHz, 1.10 V will give you a stable processor.

Of course, all this so far ignores the 1.05 V setting. My opinion is that the processor will run just fine at 65 MHz at 1.05 V, but I have nothing to back that up with besides the parallel logic that it needs 1.20 V to run at 250 MHz but only 1.10 V to run at 200 MHz. No, the datasheet does not explicitly state that it's "safe" to use 1.05 V, but the setting is there, is it not? We're looking at running at a third of the frequency while using 0.05 V less. But, once again, I understand the desire to respect what the datasheet allows.

Maybe someone with more of an electrical engineering/ hardware background could give an opinion on this. Bertrik? Saratoga?

“How much time exactly does it take to read the voltage?”
I think it takes 212 µs to read the voltage, and the change happens faster than that. I will try to find the number of ascodec_reads it takes before the voltage reads high enough, but right now I believe it is 1…

I can confirm that the TIMER2_VALUE delta for the line "ascodec_read(AS3514_ADC_1);" is 317, which means it takes 212 µs to read the voltage.

Why are you doing ascodec_read() anyway? Is it needed? Can't we just change the two CVDD bits regardless of their previous state?

And re the above: the problem with leaving the specification is that you enter a state where it may or may not work. The success rate depends on the quality of the manufacturing process (for example). As we're going to release software that runs on these devices, we basically can't risk our software not running for the very few unlucky people who got a Sansa that cannot manage 1.05V.
We can't guarantee Rockbox will run on any Fuze, e200v2 etc., because AMS can't guarantee their hardware can take the voltage.

But on the other hand, undervolting is such a common practice. I have undervolted all my computers at home; they're much cooler and obviously use much less energy. Hence I'm curious as to what the OF does.

I really doubt there will be problems; I would like to go with the "Typ: 1.10V" setting (which is the third one, and also not "allowed"). But we should respect the specifications, really.

In the early system_init(), the OF does CVDD/DCDC3 = (CVDD/DCDC3 & ~0x3f) | 0x16;
So CVDD is set to 10b = 1.10V.
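The OF's read-modify-write can be expressed as a pure function to see why its low two bits come out as 10b. The function name is made up; the mask and value are the ones from the OF dump above, and the claim that 10b selects 1.10V is taken from this thread, not independently verified:

```c
#include <stdint.h>

/* Reproduce the OF's write to CVDD/DCDC3: clear the low 6 bits and
 * set them to 0x16. The low two bits (0x16 & 0x03 == 10b) are the
 * CVDD setting; the upper bits of the register are preserved. */
static uint8_t of_cvdd_dcdc3_value(uint8_t old)
{
    return (uint8_t)((old & ~0x3Fu) | 0x16u);
}
```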

I have not seen other modifications of this ascodec register, but I may have missed some. I assume the OF doesn't use an fclk higher than 200MHz, but again I'm not sure.

But if it uses 1.10V setting for < 200MHz then we could use it as well.

kugel: we need to use ascodec_read() at least to read the other bits of CVDD/DCDC3 register, and then to wait until the voltage has been set successfully.

FlynDice, can you update the patch to read the voltage back after switching the setting? I believe we only need to block before switching to an fclk > 200MHz.

Why are you doing ascodec_read() anyway?

We need to use ascodec_read to read the voltage from the ADC. To read the actual voltage we need to read both ADC_0 and ADC_1, but to just see whether the voltage has gone from 1.05 to 1.20 we can get away with reading just the 8 bits from ADC_1.

During the actual voltage changing I guess you could do it without the ascodec_read but then you’re assuming that nothing else is manipulating the other 6 bits in this register and you _know_ what those bits should be set to. I think you could make that work in this case.

re the datasheet: I think 1.10 volts is “allowed” at less than 200 MHz. I think it is not “allowed” at > 200 MHz.

FlynDice read the 4 messages starting from: “Comment by Rafaël Carré (funman) - Sunday, 28 June 2009, 23:01 GMT+1”

The 1.10V _typical_ setting is not allowed because it has a small "See Note (1)" on page 13, but the OF uses it anyway...

So let’s use it. It seems the better overall setting, and the OF uses it too.

“kugel: we need to use ascodec_read() at least to read the other bits of CVDD/DCDC3 register, and then to wait until the voltage has been set successfully.” – alright, I missed the DCDC3 bit :/

Ok, I have made the reduced voltage 1.10v and taken out the conditional checks for CPUFREQ_MAX, just assuming we are going with a 250 MHz max frequency. The code now reads the 8 LSBs of the CVDD voltage and waits until the reading is 1.20 volts before switching to synchronous bus mode and increasing fclk. I also added some defines to better show what is being set in the DCDC3 register. All I can tell you is that it compiles just fine; I've got a 1.10v battery bench running on my player right now, so I can't test it myself for a while.

I have a few remarks on style:

I think it would be better to move the new defines next to the code using them, and also perhaps give them better names (prefixed with AS3514_CVDD_DCDC3_, for example).
Also, if 224 were written in hex (0xE0), it would be more obvious that it's the 8 LSBs of the read voltage.

The CPUFREQ_MAX check was good IMO. I'd like to see #defines that say which bits are set (0, 1<<0, 1<<1, 1<<1|1<<0), but that's just my preference.
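Concretely, defines in that bit-oriented style might look like the following sketch. These names are hypothetical (the patch's actual names are not shown in this thread), and the only value-to-voltage fact assumed is the 10b == 1.10V setting reported from the OF dump above:

```c
/* Hypothetical bit-oriented defines for the two CVDD selection bits
 * of the CVDD/DCDC3 register; names invented for illustration. */
#define AS3514_CVDD_DCDC3_BIT0   (1 << 0)
#define AS3514_CVDD_DCDC3_BIT1   (1 << 1)
#define AS3514_CVDD_DCDC3_MASK   (AS3514_CVDD_DCDC3_BIT1 | AS3514_CVDD_DCDC3_BIT0)
#define AS3514_CVDD_1_10V        AS3514_CVDD_DCDC3_BIT1  /* 10b, per the OF dump */
```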

re: better names for the defines

I actually started out using some AS3514_XXXXX_XXX names but when I was done and tried to read it I started getting confused myself so I changed them to what is there now. I will be more than happy to rename them to whatever makes the most sense though. I was hoping to use FLYNDICES_FANTASTIK_OSCILLATING_MAGIC_PLASMA_ELECTRON_GENERATING_GIZMODIC_OPTIMUM_SETTINGS but decided against it as it did not respect the 80 column limit and I just couldn’t decide what to leave out…. ;P

On a serious note, though, I was thinking that instead of moving the defines closer to the code using them, it would make more sense to move them into a .h file like powermgmt-target.h, as they are really related to the power-management settings of the chip. What are your thoughts?

I have tried to use your input to make it better; see what you think.

I have changed the CVDDp high/low readout on the View HW info page to display the actual voltage being read from the ADC. I guess you still need to figure out where the decimal point goes on your own, but we're all clever folks here, right?

Haha, I laughed! :D

I’d move the #defines to ascodec-target.h, actually, as we use them for ascodec.

For the debug menu, I just made a quick switch-case where a string like "1.20V" is assigned to a variable which is printed with _DEBUG_PRINTF. But I'm not sure what the best way is here.

Or perhaps even to as3514.h, since lots of bits are defined here already with simple prefixes (CHG_, ADC_ ..)

Or we could create as3525/a/special/directory/where/flyndice/can/feel/free/to/ignore/the/80/columns/limit/ ? :P

EDIT: the patch is fine, so commit ASAP so we can all save the planet (and move towards a releasable state, if we aren't already there...).

Do we actually know these defines are also true for the as3414? AFAIK PP5024/as3414 doesn't feature voltage changing.

The 3414 can change the voltage while running but it was decided we shouldn’t do that since we don’t have a voltage spec for the PP5024.

