I am using the STM32F7 series of microcontrollers, and it would be most helpful to have some GPIO pin change value (toggle, pulse, go high-Z, ...) whenever the core is halted by the debugger attached to the JTAG interface. Is anyone aware of such a feature?
There are the DBGMCU registers, which can selectively stop certain peripherals (mostly timers) when the core is stopped.
The idea is to somehow make a timer output a low-level signal while it's running, and a high level when it isn't. A single timer can't do that, but it's possible with two timers in a master-slave configuration.
Configure TIM3 to output a PWM signal with a very high duty cycle: low for the first two cycles, then high for the rest of its 65536-cycle-long period. Slave it to TIM2, which runs with a period of 2 cycles and resets TIM3 on counter overflow. Thus TIM3 is forced permanently low as long as TIM2 is running, but it will output a 99.997% high PWM signal when TIM2 is stopped. Finally, TIM2 is configured to stop when the core is halted by the debugger, while TIM3 keeps running.
RCC->AHBENR |= RCC_AHBENR_GPIOBEN; // enable peripheral clocks; these might be different on your board
RCC->APB1ENR |= RCC_APB1ENR_TIM2EN | RCC_APB1ENR_TIM3EN;
// consult your datasheet for the right AF value
GPIOB->AFR[0] = (GPIOB->AFR[0] & ~GPIO_AFRL_AFRL0) | 2; // set PB0 to Alternate Function 2, TIM3
GPIOB->MODER = (GPIOB->MODER & ~GPIO_MODER_MODER0) | GPIO_MODER_MODER0_1; // set PB0 to Alternate Function
DBGMCU->APB1FZ |= DBGMCU_APB1_FZ_DBG_TIM2_STOP; // stop TIM2 when core is stopped
DBGMCU->APB1FZ &= ~DBGMCU_APB1_FZ_DBG_TIM3_STOP; // but don't stop TIM3
TIM2->ARR = 1; // master timer period
TIM2->CR2 = TIM_CR2_MMS_1; // master mode selection MMS=010 Update event
TIM2->CR1 = TIM_CR1_CEN; // enable timer 2
TIM3->ARR = 65535; // PWM period
TIM3->CCR3 = 2; // channel 3 PWM duty cycle
TIM3->CCMR2 = TIM_CCMR2_OC3M; // set channel 3 to PWM mode 2
TIM3->CCER = TIM_CCER_CC3E // enable channel 3 compare output
/* | TIM_CCER_CC3P */; // it's possible to invert output polarity
TIM3->SMCR = TIM_SMCR_TS_0 // trigger selection TS=001 ITR1 = TIM2 is master
| TIM_SMCR_SMS_2; // slave mode SMS=100 reset mode
TIM3->CR1 = TIM_CR1_CEN; // enable timer 3
I don't have an F7; the code above runs on my STM32L151 board, which happens to have an LED on PB0, which is TIM3 channel 3. The LED lights up nicely when I hit the suspend button in the debugger, and the low pulse is not noticeable to the naked eye at all. Apply an external low-pass RC filter to make it disappear if it bothers whatever component the pin is connected to. It might be possible to output a clean signal using the retriggerable one-pulse mode of the advanced timers TIM1 or TIM8, but I don't have any experience with those.
I can't get the SPI on my STM32F3 Discovery board (Datasheet) to work with the gyroscope sensor (I3G4250D) at the register level. I know I'm sending data, since I'm in full duplex and receive dummy bytes from the sensor when using 16-bit data packing. But when I try to receive using 8-bit access to the DR register, I get inconsistent values from the sensor, sometimes one byte 0xff and other times 2 bytes 0xffff (at least I think that's what's happening), but never real values from the sensor register I want to read. I think this has to do with the automatic data packing of the STM32 SPI on my chip, and I thought I was addressing that by accessing the DR register through a uint8_t*, but it doesn't seem to work. I also want to ask: when I compare the SPI protocol in the sensor datasheet (page 24) and the STM32 datasheet (page 729), I infer that both the CPOL (clock polarity) and CPHA (clock phase) bits in the STM32 SPI should be set, yet I seem to be able to at least send data with or without these bits set...
Here is my SPI initialization function, which includes an attempt to read bytes at its end, plus a function that writes a byte to a sensor register:
void SPI_Init() {
/* Peripheral Clock Enable */
RCC->AHBENR |= RCC_AHBENR_GPIOEEN|RCC_AHBENR_GPIOAEN;
RCC->APB2ENR |= RCC_APB2ENR_SPI1EN;
/* GPIO Configuration */
GPIOA->MODER |= GPIO_MODER_MODER5_1|GPIO_MODER_MODER6_1|GPIO_MODER_MODER7_1; //Alternate function
GPIOA->OSPEEDR |= GPIO_OSPEEDER_OSPEEDR5|GPIO_OSPEEDER_OSPEEDR6|GPIO_OSPEEDER_OSPEEDR7; //High speed
GPIOA->AFR[0] |= 0x00500000|0x05000000|0x50000000; //AF for SCK,MISO,MOSI
GPIOE->MODER |= GPIO_MODER_MODER3_0; //Port E for NSS Pin
/* SPI Configuration */
SPI1->CR2 |= SPI_CR2_FRXTH|SPI_CR2_RXDMAEN; //Enable DMA but DMA is not used
// not sure if I need this?|SPI_CR1_CPOL|SPI_CR1_CPHA;
SPI1->CR1 |= SPI_CR1_BR_1|SPI_CR1_SSM|SPI_CR1_SSI|SPI_CR1_MSTR|SPI_CR1_SPE; //MSB first, SPI clock ~6 MHz (fPCLK/8); software NSS management, so SPI_CR1_SSI is set high for master mode
/* Slave Device Initialization */
SPI_WriteByte(CTRL_REG1_G,0x9f);
SPI_WriteByte(CTRL_REG4_G,0x10);
SPI_WriteByte(CTRL_REG5_G,0x10);
//receive test
uint8_t test =0xff;
volatile uint8_t* spiDrPtr = (volatile uint8_t*)&SPI1->DR;
*spiDrPtr = 0x80|CTRL_REG1_G;
while(!(SPI1->SR & SPI_SR_TXE)){}
//SPI1->CR2 &= ~(SPI_CR2_FRXTH); //this is done in HAL not sure why though
*spiDrPtr = test; //Send dummy
while(!(SPI1->SR & SPI_SR_RXNE)){}
test = *spiDrPtr;
}
static void SPI_WriteByte(uint8_t regAdd, uint8_t data) {
uint8_t arr[2] = {regAdd,data}; //16 bit data packing
SPI1->DR = *((uint16_t*)arr);
}
Any suggestions?
Do not enable DMA if you do not use it.
You also need to force 16-bit access to the data register (not 32-bit):
static void SPI_WriteByte(uint8_t regAdd, uint8_t data) {
uint8_t arr[2] = {regAdd,data}; //16 bit data packing
*(volatile uint16_t *)&SPI1->DR = *((volatile uint16_t*)arr);
}
Try using FRXTH = 0, do all DR reads and writes in 16-bit words, and just discard the first received byte.
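Following that suggestion, here is a minimal sketch of a register read (the function name is mine; it assumes FRXTH is 0, the default 8-bit data size, and that NSS is handled around the call). With FRXTH = 0, RXNE is only set once two bytes sit in the RX FIFO, so a single 16-bit read fetches both:
static uint8_t SPI_ReadByte(uint8_t regAdd) {
    while (!(SPI1->SR & SPI_SR_TXE)) {}
    // the low byte is shifted out first: address with the read bit, then a dummy byte
    *(volatile uint16_t *)&SPI1->DR = (uint16_t)(0xFF00 | 0x80 | regAdd);
    while (!(SPI1->SR & SPI_SR_RXNE)) {}
    uint16_t rx = *(volatile uint16_t *)&SPI1->DR;
    return (uint8_t)(rx >> 8); // the low byte is junk clocked in during the address phase
}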
As to the latter question, about CPOL/CPHA: these bits control the SPI transfer format on the bit level. Refer to the following image (from Wikipedia).
CPOL = 0, CPHA = 0 is also called "SPI mode 0"; data bits are sampled on the SCK rising edge.
CPOL = 1, CPHA = 1 is also called "SPI mode 3"; data bits are likewise sampled on the rising edge.
The difference between the two modes is the SCK level between transfers, and an extra falling edge before the first bit.
Some chips state explicitly that both mode 0 and mode 3 are supported. Yet the I3G4250D datasheet, section 5.2, says: "SDI and SDO are, respectively, the serial port data input and output. These lines are driven at the falling edge of SPC and should be captured at the rising edge of SPC."
When data is sent to the chip from the MCU in mode 0, the MCU drives the MOSI line before the first rising edge, so a slave chip can receive valid data in both mode 0 and mode 3. But when data is transferred from the chip to the MCU, the chip may need the first falling SCK edge to shift/latch the first data bit onto the MISO line, and with mode 0 you'll receive the readings shifted by one bit.
I've emphasized the word 'may' because the chip could still work correctly in both modes, latching the first bit on the falling nSS edge; the manufacturer just didn't do the testing, or doesn't guarantee that it'll work with other revisions and in all conditions.
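So, to stay within what the datasheet guarantees, select mode 3 in the initialization code before enabling the peripheral, e.g.:
SPI1->CR1 |= SPI_CR1_CPOL | SPI_CR1_CPHA; // SPI mode 3: SCK idles high, data sampled on the rising edge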
I am a little bit confused regarding the Cortex system timer on a Cortex-M4 CPU.
Let's say, we have following configuration:
16 MHz HSI as the clock source;
AHB prescaler set to 1 (i.e. HSI divided by 1).
It means that the main system bus (i.e. AHB) runs at 16 000 000 ticks per second. As far as I know, the system timer (the so-called SysTick) runs at the speed of the main system bus, so it should count up to 16 000 000 each second. That seems obvious, but when I look at the clock tree diagram in the STM32F407xx reference manual, it looks like the system timer runs at (main system bus speed) / 8.
Is that true? I have configured the system timer to generate an interrupt every 16 000 000 ticks. Based on the configuration provided above (i.e. HSI as the clock source and AHB prescaler = 1) it generates an interrupt each second, which toggles an LED on and off. I have tried to measure the time between blinks and it seems to be exactly 1 s. If this /8 prescaler were in effect, the LED should toggle every 8 s.
Below you can find the code which configures the system clock source and the system timer.
HSI frequency = 16 [MHz]
SYSTICKS_COUNT = 16 000 000
void system_clock_init(void)
{
LL_RCC_HSI_Enable();
while (LL_RCC_HSI_IsReady() != 1) {
;
}
LL_FLASH_SetLatency(LL_FLASH_LATENCY_0);
LL_RCC_SetAHBPrescaler(LL_RCC_SYSCLK_DIV_1);
LL_RCC_SetSysClkSource(LL_RCC_SYS_CLKSOURCE_HSI);
while (LL_RCC_GetSysClkSource() != LL_RCC_SYS_CLKSOURCE_STATUS_HSI) {
;
}
LL_RCC_SetAPB1Prescaler(LL_RCC_APB1_DIV_1);
LL_RCC_SetAPB2Prescaler(LL_RCC_APB2_DIV_1);
}
void system_clock_systick_config_init(void)
{
SysTick_Config(SYSTICKS_COUNT);
}
void SysTick_Handler(void)
{
led_toggle(LED_PIN_BOARD_GREEN);
}
The Reference Manual states at the end of the section "6.2 Clocks":
The RCC feeds the external clock of the Cortex System Timer (SysTick) with the AHB clock
(HCLK) divided by 8. The SysTick can work either with this clock or with the Cortex clock
(HCLK), configurable in the SysTick control and status register.
According to the STM32 Cortex-M4 programming manual, bit 2 of the SysTick control register (STK_CTRL) selects the clock source:
Bit 2 CLKSOURCE: Clock source selection
0: AHB/8
1: Processor clock (AHB)
According to the same manual, the reset value is 0 (AHB/8). But your code does set this bit to 1: the CMSIS SysTick_Config() function always selects the processor clock, which is why you measure 1 s rather than 8 s.
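If you actually wanted SysTick to run from AHB/8, you would have to program the registers yourself instead of calling SysTick_Config(); a minimal sketch:
SysTick->LOAD = 16000000UL / 8 - 1;  // 1 s per interrupt at 16 MHz HCLK with the /8 input
SysTick->VAL  = 0;                   // clear the current counter value
SysTick->CTRL = SysTick_CTRL_TICKINT_Msk | SysTick_CTRL_ENABLE_Msk; // CLKSOURCE = 0 -> AHB/8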
I am using the internal ADC temperature sensor in a low-power device. Without the sensor, in Stop mode the microcontroller consumes around 4 µA, but when the temperature sensor is on, the consumption goes up to 8-9 µA.
The problem is I cannot turn the sensor off. (I measured the "off" current by disabling the sensor from the start in STM32CubeMX.)
I am searching for code that can turn the temperature sensor off.
Up until now I have tested these:
1-
HAL_ADC_Init(&hadc);
hadc.Lock=HAL_UNLOCKED;
__HAL_UNLOCK(&hadc);
HAL_ADCEx_DisableVREFINTTempSensor();
2-
ADC1->CR &= 0x00000000;
ADC->CCR &= ~(1 << 23); // trying to clear TSEN
I prefer to work with HAL, but it does not seem to cut the sensor power.
Your ADC1->CR &= 0x00000000; line looks wrong to me, depending on the controller you're using.
There is usually a bit which must be set to disable the ADC, rather than writing all zeros. Try ADC1->CR = (0x01 << 1); instead. If you have the ST-provided defines for your processor, ADC1->CR = ADC_CR_ADDIS; is the same but more readable. After disabling the ADC you will be able to turn off the TSEN bit of ADC->CCR.
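For reference, a sketch of the full disable sequence the reference manuals recommend (register names are for an STM32L0/F0-style ADC; check them against your part):
if (ADC1->CR & ADC_CR_ADSTART) {   // a running conversion must be stopped first
    ADC1->CR |= ADC_CR_ADSTP;
    while (ADC1->CR & ADC_CR_ADSTP) {}
}
ADC1->CR |= ADC_CR_ADDIS;          // request ADC disable
while (ADC1->CR & ADC_CR_ADEN) {}  // wait until the ADC is actually off
ADC->CCR &= ~ADC_CCR_TSEN;         // now the temperature sensor can be powered down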
Using a Cortex-M3, Arduino Due.
Does anyone know if it's possible to get a PWM channel to disable itself after so many pulses?
What I want to try out is something like this:
Interrupt 1 fires (timer0); it sets the delay of the PWM and how many cycles to go for.
The PWM starts and each pulse increments a counter; once the counter reaches its limit, the PWM disables itself.
What I'm NOT interested in is any loop outside of the PWM settings doing the counting/disabling.
Just add an interrupt to the timer which generates the PWM (either from "update" or from "compare"; the result will vary slightly, so pick the one you prefer) and increment the counter there. Once the counter reaches your target value, disable the timer from the interrupt, and that's all.
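A minimal sketch of that idea for the Due's SAM3X PWM controller (the channel number and pulse limit are made up; it assumes channel 0 is already configured and running):
#define PULSE_LIMIT 100u              // hypothetical number of pulses
static volatile uint32_t pulse_count;

void pwm_pulse_limit_arm(void) {
    pulse_count = 0;
    PWM->PWM_IER1 = PWM_IER1_CHID0;   // interrupt at each channel 0 period event
    NVIC_EnableIRQ(PWM_IRQn);
}

void PWM_Handler(void) {
    if (PWM->PWM_ISR1 & PWM_ISR1_CHID0) {   // reading PWM_ISR1 clears the flags
        if (++pulse_count >= PULSE_LIMIT) {
            PWM->PWM_DIS = PWM_DIS_CHID0;   // the channel disables itself here
            PWM->PWM_IDR1 = PWM_IDR1_CHID0; // no more interrupts until re-armed
        }
    }
}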
There is a better way to handle this if your controller has DMA and a DMA iteration counter.
Configure the DMA channel to transmit a dummy byte upon each pulse, and configure the DMA to raise an interrupt when the iteration counter reaches the threshold. You can stop the PWM counter inside that ISR. Since the DMA does the pulse counting, there is very little load on the CPU.
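The registers differ per family; purely as an illustration of the idea, on an STM32F1 (where TIM3 update requests are wired to DMA1 channel 3) it could look roughly like this:
static volatile uint8_t dma_src, dma_dst;   // dummy locations, we only want the request counter

void pulse_count_arm(uint16_t n_pulses) {
    RCC->AHBENR |= RCC_AHBENR_DMA1EN;
    DMA1_Channel3->CPAR  = (uint32_t)&dma_src;
    DMA1_Channel3->CMAR  = (uint32_t)&dma_dst;
    DMA1_Channel3->CNDTR = n_pulses;                  // the "iteration counter"
    DMA1_Channel3->CCR   = DMA_CCR_TCIE | DMA_CCR_EN; // one dummy transfer per request
    TIM3->DIER |= TIM_DIER_UDE;                       // one DMA request per PWM period
    NVIC_EnableIRQ(DMA1_Channel3_IRQn);
}

void DMA1_Channel3_IRQHandler(void) {
    DMA1->IFCR = DMA_IFCR_CTCIF3;                     // clear the transfer-complete flag
    TIM3->CR1 &= ~TIM_CR1_CEN;                        // stop the PWM timer
    TIM3->DIER &= ~TIM_DIER_UDE;
}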
I'm using the Lightning drivers with Windows IoT Core to drive a PWM output. I've attached a scope to the GPIO pin and I set the PWM duty cycle in an infinite loop. If I put a delay in the loop, the output signal looks fine. If I drop the delay, however, the duty cycle (as seen on the scope) starts to flicker between 5 and 10%. Code below; can anyone explain this?
var controllers = await PwmController.GetControllersAsync(LightningPwmProvider.GetPwmProvider());
var pwmController = controllers[1];
pwmController.SetDesiredFrequency(50);
var motor1 = pwmController.OpenPin(5);
motor1.Start();
do
{
motor1.SetActiveDutyCyclePercentage(0.05);
Task.Delay(1000).Wait();
} while (true);
I'm just guessing here, but it could be that SetActiveDutyCyclePercentage actually resets the PWM counter, so it messes up the current cycle of the PWM. If you call it repeatedly, it will mess up a lot of the cycles, versus putting in a delay. Think of PWM as a counter that flips the output when it hits 0: if you reset the counter (with the SetActiveDutyCyclePercentage call), then the current cycle's total number of counts, i.e. its length before the output flips, will be skewed.