USB issue SAM3S-EK -> Custom card

I am developing my project with the SAM3S-EK demo board. I used the USB CDC and MSC drivers with the example code and ASF, and everything works fine. Now I want to move the code onto my custom board (with a SAM3S1B).
But here is my problem: I have assigned the pins and changed the clock configuration, but the device is not recognized by Windows. All of the descriptors read as zero (according to USBlyzer).
Can someone help me?
This is my clock config file (I have a 16 MHz crystal):
// ===== System Clock (MCK) Source Options
#define CONFIG_SYSCLK_SOURCE SYSCLK_SRC_PLLACK
// ===== System Clock (MCK) Prescaler Options (Fmck = Fsys / (SYSCLK_PRES))
#define CONFIG_SYSCLK_PRES SYSCLK_PRES_4
// ===== PLL0 (A) Options (Fpll = (Fclk * PLL_mul) / PLL_div)
// Use mul and div effective values here.
#define CONFIG_PLL0_SOURCE PLL_SRC_MAINCK_XTAL
#define CONFIG_PLL0_MUL 32
#define CONFIG_PLL0_DIV 2
// ===== PLL1 (B) Options (Fpll = (Fclk * PLL_mul) / PLL_div)
// Use mul and div effective values here.
#define CONFIG_PLL1_SOURCE PLL_SRC_MAINCK_12M_RC
#define CONFIG_PLL1_MUL 16
#define CONFIG_PLL1_DIV 2
// ===== USB Clock Source Options (Fusb = FpllX / USB_div)
// Use div effective value here.
//#define CONFIG_USBCLK_SOURCE USBCLK_SRC_PLL0
#define CONFIG_USBCLK_SOURCE USBCLK_SRC_PLL1
#define CONFIG_USBCLK_DIV 2
// ===== Target frequency (System clock)
// - XTAL frequency: 16MHz
// - System clock source: PLLA
// - System clock prescaler: 4 (divided by 4)
// - PLLA source: XTAL
// - PLLA output: XTAL * 32 / 2 = 256MHz
// - System clock is: 16 * 32 / 2 / 4 = 64MHz
// ===== Target frequency (USB Clock)
// - USB clock source: PLLB
// - USB clock divider: 2 (divided by 2)
// - PLLB source: internal 12MHz RC oscillator
// - PLLB output: 12 * 16 / 2 = 96MHz
// - USB clock: 12 * 16 / 2 / 2 = 48MHz
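Note that with the configuration above the 48 MHz USB clock is derived from the internal 12 MHz RC oscillator, which is typically not accurate enough for full-speed USB (the USB clock needs crystal-grade tolerance). A sketch of an alternative worth trying, using only symbols already present in this conf_clock.h, is to run PLLB from the crystal instead:
#define CONFIG_PLL1_SOURCE PLL_SRC_MAINCK_XTAL
#define CONFIG_PLL1_MUL 12 // 16MHz * 12 / 2 = 96MHz
#define CONFIG_PLL1_DIV 2
#define CONFIG_USBCLK_SOURCE USBCLK_SRC_PLL1
#define CONFIG_USBCLK_DIV 2 // 96MHz / 2 = 48MHz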

Like all USB devices used under Windows, this one first needs a Windows-side USB driver specific to the device you are attaching.
When you install Atmel Studio 6.2 or newer, it installs a Windows-side USB driver matching the Atmel ASF USB stack you are using in your firmware. That Windows driver works with my SAM4E target processor. Be aware that it takes a long time for Windows to load the driver; it will appear to have hung. Just give it time, and it will eventually install. You will probably have to respond to a pop-up warning to permit installation of an unsigned driver.
The Windows driver can also be downloaded and installed separately. Use this link:
https://gallery.atmel.com/Products/Details/6272a8fd-68fe-43d8-a990-741878cfe7b6?

Double-check your clock rate. I am using the SAM4L part, and it requires the PLL to run off OSC0 to generate a 48 MHz clock. I had the same problem for a while because my ABDACB used the same clock and changed the rate. As I understand it, plugging in a USB device lets the host sense the single pull-up resistor on the DP or DN pin, depending on the speed. That is what tells Windows (the host) to attempt to communicate. If the clock rate is wrong, the properties in Windows all show 0s.
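A cheap way to rule the clock in or out is to read back the effective CPU frequency before starting the USB stack. This is only a sketch, assuming the ASF common clock service (sysclk_get_cpu_hz() is part of ASF); the expected 64 MHz comes from the conf_clock.h above:
#include "sysclk.h"

void check_clock(void)
{
    // Trap here if the clock tree did not come up as intended.
    if (sysclk_get_cpu_hz() != 64000000UL) {
        while (1) { /* wrong clock config; fix conf_clock.h before debugging USB */ }
    }
}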

Related

How to increase spidev bus speed

I want to communicate through SPI between an Up2 6000 and a microcontroller. On the Up2 I am using Ubuntu 20.04 with kernel 5.13 and the PREEMPT_RT patch.
Linux up 5.13.0-rt1 #1 SMP PREEMPT_RT Thu Oct 13 12:09:18 CEST 2022 x86_64 x86_64 x86_64 GNU/Linux
Then my program to communicate through SPI sets the speed like this:
#include <fcntl.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/spi/spidev.h>
#include <cstdint>

int file_descriptor = open("/dev/spidev1.0", O_RDWR);
if (file_descriptor < 0) {
    return -2;
}
// Set the SPI mode. Note: SPI_IOC_WR_MODE expects a pointer to the mode byte,
// not the mode value itself.
uint8_t mode = SPI_MODE_1 | SPI_CS_HIGH;
if (ioctl(file_descriptor, SPI_IOC_WR_MODE, &mode) < 0) {
    close(file_descriptor);
    return -3;
}
// Set the SPI speed
uint32_t speed{8000000};
if (ioctl(file_descriptor, SPI_IOC_WR_MAX_SPEED_HZ, &speed) < 0) {
    close(file_descriptor);
    return -4;
}
But on the oscilloscope I can see that the SCLK signal runs at 1 MHz. When I set the speed to less than 1 MHz I can see the difference on the oscilloscope, so a maximum must be set somewhere.
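As an aside, SPI_IOC_WR_MAX_SPEED_HZ only sets the device's default speed; spidev also lets you request a speed per transfer through struct spi_ioc_transfer. A minimal sketch using the standard spidev API (same 8 MHz request, hypothetical 4-byte payload):
#include <string.h>

uint8_t tx[4] = {0xDE, 0xAD, 0xBE, 0xEF};
uint8_t rx[4] = {0};
struct spi_ioc_transfer xfer;
memset(&xfer, 0, sizeof xfer);   // zero the padding fields
xfer.tx_buf = (unsigned long)tx;
xfer.rx_buf = (unsigned long)rx;
xfer.len = sizeof tx;
xfer.speed_hz = 8000000;         // per-transfer speed request
xfer.bits_per_word = 8;
if (ioctl(file_descriptor, SPI_IOC_MESSAGE(1), &xfer) < 0) {
    // the controller driver may clamp or reject the requested speed
}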
To enable spidev on the Up2 6000 I use ACPI overlays. I also set the maximum speed there:
/*
 * This ASL can be used to declare a spidev device on SPI1 CS0
 */
DefinitionBlock ("", "SSDT", 5, "INTEL", "SPIDEV0", 0x00000001)
{
    External (_SB.PC00.SPI1, DeviceObj)

    Scope (\_SB.PC00.SPI1)
    {
        Device (TP0)
        {
            Name (_HID, "SPT0001")  // _HID: Hardware ID
            Name (_DDN, "SPI test device connected to CS0")  // _DDN: DOS Device Name
            Name (_CRS, ResourceTemplate ()  // _CRS: Current Resource Settings
            {
                SpiSerialBusV2 (0x0000, PolarityLow, FourWireMode, 0x08,
                    ControllerInitiated, 8000000, ClockPolarityLow,
                    ClockPhaseFirst, "\\_SB.PC00.SPI1",
                    0x00, ResourceConsumer, , Exclusive,
                    )
            })
        }
    }
}
But the speed is still 1MHz. Am I missing something else?
Edit:
I have read the speed back using SPI_IOC_RD_MAX_SPEED_HZ and it is 8 MHz, but on my logic analyzer I still see 1 MHz.
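For reference, the read-back is the standard spidev RD ioctl:
#include <stdio.h>

uint32_t actual_speed = 0;
if (ioctl(file_descriptor, SPI_IOC_RD_MAX_SPEED_HZ, &actual_speed) == 0) {
    printf("driver reports max speed: %u Hz\n", actual_speed); // prints 8000000 here
}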
I compiled my kernel with some extra logging to see what was happening in the drivers.
Apparently there are two instances of the pxa2xx-spi driver. I suppose one is internal and not really accessible (the first one, with a frequency of 100 MHz), and the other one is connected to the GPIO header (with a maximum frequency of 990099 Hz).
The SPT0001 ACPI device then tries to raise this frequency to 8 MHz, but since that is higher than the controller's maximum, the driver does not apply it. Finally, when I try to open a file descriptor for SPI exchanges at 10 MHz, the driver again refuses because the controller's maximum is lower.
Where is this frequency limit set? I tried compiling the kernel with different configurations, but it didn't change. Is it something related to the hardware itself? Or perhaps another ACPI table that I am missing?

Why does D2 RAM work correctly even when clock is disabled?

TL;DR: documentation states I have to enable a specific memory region in the microcontroller before I can use it. However, I can use it before enabling it, or even after disabling it. How is this possible?
I'm currently developing an application for the STM32H743 microcontroller. I don't understand how the RAM seems to work correctly while the clock is disabled.
This MCU has multiple memories, spread over multiple power domains:
In D1 domain it has ITCMRAM + DTCMRAM + AXI SRAM (64 + 128 + 512 kB)
In D2 domain it has SRAM1 + SRAM2 + SRAM3 (128 + 128 + 32 kB)
In D3 domain it has SRAM4 + Backup SRAM (64 + 4 kB)
I want to use SRAM1. In the reference manual (RM0433 Rev. 7), page 366 states:
If the CPU wants to use memories located into D2 domain (SRAM1, SRAM2 and SRAM3), it has to enable them.
The register description on page 452 explains how to do this:
RCC AHB2 Clock Register (RCC_AHB2ENR):
SRAM1EN: SRAM1 block enable
Set and reset by software.
When set, this bit indicates that the SRAM1 is allocated by the CPU. It causes the D2 domain to
take into account also the CPU operation modes, i.e. keeping D2 domain in DRun when the CPU is
in CRun.
0: SRAM1 interface clock is disabled. (default after reset)
1: SRAM1 interface clock is enabled.
So, the default value (after reset) is 0, which means the SRAM1 interface is disabled.
In this thread on the STM Community forum the question was why the D2 RAM wasn't working correctly and the solution was to enable the D2 RAM clocks. The "correct" way to do this is in SystemInit() (part of the STM32H7 HAL). In system_stm32h7xx.c we can find the following code parts:
/************************* Miscellaneous Configuration ************************/
/*!< Uncomment the following line if you need to use initialized data in D2 domain SRAM (AHB SRAM) */
// #define DATA_IN_D2_SRAM
(...)
void SystemInit(void)
{
  (...)
#if defined(DATA_IN_D2_SRAM)
  /* in case of initialized data in D2 SRAM (AHB SRAM), enable the D2 SRAM clock (AHB SRAM clock) */
#if defined(RCC_AHB2ENR_D2SRAM3EN)
  RCC->AHB2ENR |= (RCC_AHB2ENR_D2SRAM1EN | RCC_AHB2ENR_D2SRAM2EN | RCC_AHB2ENR_D2SRAM3EN);
#elif defined(RCC_AHB2ENR_D2SRAM2EN)
  RCC->AHB2ENR |= (RCC_AHB2ENR_D2SRAM1EN | RCC_AHB2ENR_D2SRAM2EN);
#else
  RCC->AHB2ENR |= (RCC_AHB2ENR_AHBSRAM1EN | RCC_AHB2ENR_AHBSRAM2EN);
#endif /* RCC_AHB2ENR_D2SRAM3EN */
  tmpreg = RCC->AHB2ENR;
  (void)tmpreg;
#endif /* DATA_IN_D2_SRAM */
  (...)
}
So, to use D2 SRAM, the macro DATA_IN_D2_SRAM should be defined (or you must manually enable the clock using __HAL_RCC_D2SRAM1_CLK_ENABLE()).
However, I don't have this macro defined, and even when I manually disable the clocks, the RAM seems to be working perfectly fine.
My main task (I'm running FreeRTOS, and this is the only task right now) is like this:
void main_task(void * argument)
{
    __HAL_RCC_D2SRAM1_CLK_DISABLE();
    __HAL_RCC_D2SRAM2_CLK_DISABLE();
    __HAL_RCC_D2SRAM3_CLK_DISABLE();
    mem_test(); // expected to fail, but runs successfully
    for (;;) {}
}
The memory test completely fills the D2 SRAM with known data, then calculates a CRC over it. The CRC is correct. I already have verified that the buffer is really placed in the D2 SRAM (memory address 0x30000400 is within the range 0x30000000-0x3001FFFF of SRAM1). The value of RCC->AHB2ENR is confirmed to be 0 (all clocks disabled). I also confirmed that the address of RCC->AHB2ENR is 0x580244DC, as stated in the datasheet.
The data cache is disabled.
What am I missing here? Why is this memory readable and writable when the clocks are disabled?
UPDATE: On request, here is the code of my memory test, from which I conclude that the memory can be written and read successfully:
// NB: The sections are defined in the linker script.
static char test_data_d1[16] __attribute__((section(".RAM_D1_data"))) = "Test data in D1";
static char test_data_d2[16] __attribute__((section(".RAM_D2_data"))) = "Test data in D2";
static char test_data_d3[16] __attribute__((section(".RAM_D3_data"))) = "Test data in D3";
static char buffer_d1[256 * 1024ul] __attribute__((section(".RAM_D1_bss")));
static char buffer_d2[256 * 1024ul] __attribute__((section(".RAM_D2_bss")));
static char buffer_d3[ 32 * 1024ul] __attribute__((section(".RAM_D3_bss")));
static void mem_test(void)
{
    // Fill the buffers each with a different test pattern.
    fill_buffer_with_test_data(buffer_d1, sizeof(buffer_d1), test_data_d1);
    fill_buffer_with_test_data(buffer_d2, sizeof(buffer_d2), test_data_d2);
    fill_buffer_with_test_data(buffer_d3, sizeof(buffer_d3), test_data_d3);

    uint32_t crc_d1 = crc32b((uint8_t const *)buffer_d1, sizeof(buffer_d1));
    uint32_t crc_d2 = crc32b((uint8_t const *)buffer_d2, sizeof(buffer_d2));
    uint32_t crc_d3 = crc32b((uint8_t const *)buffer_d3, sizeof(buffer_d3));

    printf("CRC buffer_d1 = 0x%08lX\n", crc_d1);
    printf("CRC buffer_d2 = 0x%08lX\n", crc_d2);
    printf("CRC buffer_d3 = 0x%08lX\n", crc_d3);

    assert(0xC29DFAED == crc_d1); // Python: hex(binascii.crc32(16384 * b'Test data in D1\0'))
    assert(0x73B70C2A == crc_d2); // Python: hex(binascii.crc32(16384 * b'Test data in D2\0'))
    assert(0xC30AE71E == crc_d3); // Python: hex(binascii.crc32(2048 * b'Test data in D3\0'))
}
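The fill helper is not shown here; judging from the Python comments above (the 16-byte string, including its NUL terminator, repeated across the whole buffer), a matching implementation would look like this:
#include <stddef.h>

static void fill_buffer_with_test_data(char *buf, size_t size, char const pattern[16])
{
    // Repeat the 16-byte pattern (including its terminating NUL) across the buffer.
    for (size_t i = 0; i < size; i++) {
        buf[i] = pattern[i % 16];
    }
}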
After lots of testing and investigating I found out that the D2 SRAM was disabled (as documented and expected) in a minimal application using the SysTick and only a few LEDs to make the test results visible. However, when using a timer (TIM1) instead of SysTick, or when enabling a USART, the D2 SRAM was enabled as well, even when I did not enable it in my code. In fact, adding either one of the following lines of code would implicitly enable the D2 SRAM:
__HAL_RCC_TIM1_CLK_ENABLE();
__HAL_RCC_USART3_CLK_ENABLE();
STM support has confirmed this behavior:
D2 SRAM is activated as soon as any peripheral in D2 is activated. It means that if you enable the clock for any peripheral located in the D2 domain (AHB1, AHB2, APB1 and APB2), D2 SRAM is active even if RCC->AHB2ENR is 0.
I'm still looking for a reliable source (reference manual) where this behavior is documented, but it seems to be a plausible explanation.
In practice I think this means that the D2 SRAM will almost always be enabled automagically, so you don't have to care about it, at least in the most common use cases (e.g. when using any D2 peripheral or the DMA controllers). Only if you want to use the D2 SRAM without enabling any D2 peripheral do you have to enable the SRAM clocks manually. The same applies to startup code, where (if you choose to implement this) the D2 SRAM is initialized before any of the peripherals are enabled.
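For that startup-code case, a minimal sketch (reusing the register bits from the SystemInit() fragment above; this must run before anything, such as the C runtime's data initialization, touches D2 SRAM):
/* Enable the D2 SRAM clocks explicitly before using the memory. */
RCC->AHB2ENR |= (RCC_AHB2ENR_D2SRAM1EN | RCC_AHB2ENR_D2SRAM2EN | RCC_AHB2ENR_D2SRAM3EN);
(void)RCC->AHB2ENR; /* dummy read-back so the write takes effect before the SRAM is used */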

How can I change the start address on flash?

I'm using STM32F746ZG and FreeRTOS.
The flash start address is 0x08000000, but I want to change it to 0x08040000. I've searched for this through Google but didn't find a solution.
I changed the linker script as follows.
MEMORY
{
    RAM (xrw)  : ORIGIN = 0x20000000, LENGTH = 320K
    /* FLASH (rx) : ORIGIN = 0x8000000, LENGTH = 1024K */
    FLASH (rx) : ORIGIN = 0x8040000, LENGTH = 768K
}
If I change only that and run the debugger, the problem remains.
If I also change VECT_TAB_OFFSET from 0x00 to 0x40000, it works fine.
/* #define VECT_TAB_SRAM */
#define VECT_TAB_OFFSET 0x40000 /* 0x00 */
SCB->VTOR = FLASH_BASE | VECT_TAB_OFFSET;
But without the debugger it doesn't work at all.
In other words, it only works when started through the ST-Link.
Please let me know if you know the solution.
Thank you in advance for your reply.
The boot address can be set in the option bytes.
You can set any address in the flash in 16K increments. There are two 16-bit registers in the option bytes area; one is used when the BOOT pin is low at reset, the other when the pin is high. Write the desired address shifted right by 14 bits, i.e. divided by 16384.
To boot from 0x08040000, write 0x2010 (0x08040000 >> 14) into the register, as described in the Option bytes programming chapter of the reference manual.
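A register-level sketch of that (hedged: symbol names from the CMSIS device header; on the STM32F746, BOOT_ADD0 occupies the low half of FLASH_OPTCR1 and is used when the BOOT pin is low):
/* Unlock the option bytes, set BOOT_ADD0 = 0x2010 (boot from 0x08040000),
   then start option-byte programming and re-lock. */
FLASH->OPTKEYR = 0x08192A3B;
FLASH->OPTKEYR = 0x4C5D6E7F;
while (FLASH->SR & FLASH_SR_BSY) {}
FLASH->OPTCR1 = (FLASH->OPTCR1 & ~0xFFFFu) | 0x2010u;
FLASH->OPTCR |= FLASH_OPTCR_OPTSTRT;
while (FLASH->SR & FLASH_SR_BSY) {}
FLASH->OPTCR |= FLASH_OPTCR_OPTLOCK;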
You could also write a bootloader. The bootloader sits at address 0x08000000 and loads your application firmware, i.e. jumps to it.
This is the other way to do it.
You need to place 8 bytes at the original beginning of the flash. The STM32 always boots from address 0x00000000, which is aliased to one of the memories (depending on the boot pins and options).
The first word contains the initial stack pointer, the second one the address of your reset handler. Without those 8 bytes you never get to your code, as the part always boots from the same address.
You will need to modify your linker script and the startup files where the vectors are defined.
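A minimal sketch of the jump itself (hedged: CMSIS names; APP_BASE is a hypothetical constant for the application's vector table at 0x08040000):
#include "stm32f7xx.h"

#define APP_BASE 0x08040000UL /* hypothetical: where the application's vector table lives */

static void jump_to_app(void)
{
    uint32_t app_sp    = *(volatile uint32_t *)(APP_BASE);      /* word 0: initial stack pointer */
    uint32_t app_reset = *(volatile uint32_t *)(APP_BASE + 4u); /* word 1: reset handler address */

    SCB->VTOR = APP_BASE;          /* point the vector table at the application */
    __set_MSP(app_sp);             /* load the application's stack pointer */
    ((void (*)(void))app_reset)(); /* enter the application's reset handler */
}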

STM32F4 USART1 sends garbage

I am having a problem: when sending a char from an STM32F411 to the PC, it reads as garbage, but in the opposite direction the MCU correctly reads the char sent.
I perform the following actions:
Enable the GPIOA clock and configure pins 9 and 10 for their alternate function (see the sketch after this list).
Enable USART1, leaving default values for M (word length), stop bits, and DMA.
Set USARTDIV to produce 9600 baud at 16 MHz (HSI). *
Configure the USART to send an idle frame as the first transmission.
* I have also tried with a 100 MHz APB2 bus frequency, with the same result.
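For reference, a register-level sketch of the pin setup from the first step (assuming CMSIS device-header names; on the STM32F411, USART1 TX/RX on PA9/PA10 is alternate function AF7):
RCC->AHB1ENR |= RCC_AHB1ENR_GPIOAEN;                        // enable GPIOA clock
GPIOA->MODER &= ~(GPIO_MODER_MODER9 | GPIO_MODER_MODER10);
GPIOA->MODER |= GPIO_MODER_MODER9_1 | GPIO_MODER_MODER10_1; // 0b10 = alternate function
GPIOA->AFR[1] &= ~((0xFu << 4) | (0xFu << 8));              // AFRH fields for PA9, PA10
GPIOA->AFR[1] |= (7u << 4) | (7u << 8);                     // AF7 = USART1
RCC->APB2ENR |= RCC_APB2ENR_USART1EN;                       // enable USART1 clock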
Configuring USART
// 1. Enable the USART
SET_BIT(USART1->CR1, USART_CR1_UE);
// 5. Select the desired baud rate in BRR. BRR holds a value, not flags,
// so write it directly rather than OR-ing bits in with SET_BIT.
USART1->BRR = 0x683; // USARTDIV for 9600 baud at 16 MHz
// 6. Set TE in CR1 to send an idle frame as first transmission
SET_BIT(USART1->CR1, USART_CR1_TE);
After that I try to receive the character in RealTerm 2.0 with the following configuration: 9600 8N1, no flow control.
The character is sent by the following code:
void SendChar_USART(char pChar)
{
    // Transmitter steps 7, 8
    // 7. Write the data to send into the DR register (this clears TXE)
    USART1->DR = pChar;
    // 8. Wait until TXE is set again, i.e. the data has moved to the shift register
    while (!READ_BIT(USART1->SR, USART_SR_TXE));
}
Update 1
Switching to USART2 with the exact same configuration solves the problem, and the text arrives intact in the serial terminal. However, this leaves the question unanswered: why does USART1 not work as expected?
There is a capacitor on the way to the PA9 pin of the extension connector, filtering out the USART1 TX signal. Peter Harrison explains the issue very well, I think:
http://www.micromouseonline.com/2013/05/05/using-usart1-on-the-stm32f4discovery/

OLED on Zedboard

I am very new to the Zedboard. I have a Zedboard running an Ubuntu image, and I am trying to write a driver to run the OLED on the board. On start-up the OLED shows a display (the Xilinx logo), so I assume it already has a driver. I have the following questions:
a) How is the OLED on the Zedboard internally connected: through SPI, GPIOs, or the PL? If it's through SPI/GPIOs, then which pins?
b) Is there any tutorial or documentation I can follow to create userspace drivers using SPI/GPIO for the OLED on the Zedboard?
c) I have a Red Hat desktop; is there an SDK I can use to develop userspace drivers for the Zedboard from it?
I have seen a lot of material on the Zedboard, but none of it discusses how the OLED is internally connected. One document shows that it's connected to the PL. If that's the case, how can I write userspace drivers using the PL on the Zedboard? I will be coding in C.
Appreciate your help, and thanks in advance!
a) How is the OLED on the Zedboard internally connected: through SPI, GPIOs, or the PL? If it's through SPI/GPIOs, then which pins?
This is the first or second result for the web search "zedboard oled pdf": http://zedboard.org/sites/default/files/ZedBoard_HW_UG_v1_1.pdf
Searching for "oled" in it gives (page numbers of the PDF file, not those printed in the document):
page 3: 2.4.4 OLED ... 19
page 4: 128x32 OLED Display
page 5: ZYNQ XC7Z020-CSG484 OLED <- bus of 5 -> 128x32 OLED
page 20: 2.4.4 OLED
An Inteltronic/Wisechip UG-2832HSWEG04 OLED display is used on the ZedBoard. This provides a 128x32 pixel, passive-matrix, monochrome display. The display size is 30mm x 11.5mm x 1.45mm.
Table 11 - OLED Connections:
OLED pin  Symbol  EPP pin  Function
9         RES#    U9       Power Reset for Controller and Driver
8         CS#     N/C      Chip Select - Pulled Down on Board
10        D/C#    U10      Data/Command Control
11        SCLK    AB12     Serial Clock Input Signal
12        SDIN    AA12     Serial Data Input Signal
So we know the OLED model, UG-2832HSWEG04 (datasheet with low-level details on the data interface: http://www.adafruit.com/datasheets/UG-2832HSWEG04.pdf), and the data connection: this is an OLED with one serial data input and one serial clock.
The pinout PDF is http://www.xilinx.com/support/documentation/user_guides/ug865-Zynq-7000-Pkg-Pinout.pdf (too long to read), but there is a shorter version of the pin list in txt format: http://www.xilinx.com/support/packagefiles/z7packages/xc7z020clg484pkg.txt
Device/Package xc7z020clg484  9/18/2012 10:07:35
Pin   Pin Name      Memory Byte Group  Bank  VCCAUX Group  Super Logic Region  I/O Type
AA12  IO_L7P_T1_13  1                  13    NA            NA                  HR
AB12  IO_L7N_T1_13  1                  13    NA            NA                  HR
HR means "3.3V-capable high-range (HR) banks"; both data pins are in bank 13. The pin name is IO_*, so it "supports both input, as well as output functionality" and is part of the "PL pins" (PL = programmable logic = FPGA). The default Zedboard FPGA firmware gives the ARM part of the chip running the Linux kernel (PS = processing system) access to these pins by routing them to internal processing_system GPIO pins via the system.ucf file, like:
NET processing_system7_0_GPIO_pin[5] LOC = AB12 | IOSTANDARD="LVCMOS25"; # "OLED-SCLK"
NET processing_system7_0_GPIO_pin[6] LOC = AA12 | IOSTANDARD="LVCMOS25"; # "OLED-SDIN"
Then the GPIO pins are registered in the devicetree (dts) https://github.com/Digilent/linux-digilent/blob/master/arch/arm/boot/dts/digilent-zed.dts in the zed_oled group:
zed_oled {
    compatible = "dglnt,pmodoled-gpio";
    /* GPIO Pins */
    vbat-gpio = <&ps7_gpio_0 55 0>;
    vdd-gpio = <&ps7_gpio_0 56 0>;
    res-gpio = <&ps7_gpio_0 57 0>;
    dc-gpio = <&ps7_gpio_0 58 0>;
    /* SPI-GPIOs */
    spi-bus-num = <2>;
    spi-speed-hz = <4000000>;
    spi-sclk-gpio = <&ps7_gpio_0 59 0>;
    spi-sdin-gpio = <&ps7_gpio_0 60 0>;
};
b) Is there any tutorial or documentation I can follow to create userspace drivers using SPI/GPIO for the OLED on the Zedboard?
According to Avnet's Getting Started PDF, section "Demo 2 – OLED Display" on page 17 (found by the web search "zedboard oled"), http://zedboard.org/sites/default/files/documentations/GS-AES-Z7EV-7Z020-G-14.1-V6%5B1%5D.pdf#page=17, there is a kernel driver pmodoled-gpio.ko (in the screenshot it is reported as "pmodoled-gpio-spi"), so the OLED is driven with GPIO pins.
There are two helper scripts: unload_oled removes the kernel module, and load_oled inserts it into the kernel. The driver creates a special device file, /dev/zed_oled, for working with the display from user space; load_oled also displays the /root/logo.bin file using this zed_oled interface.
Typical usage of zed_oled is cat yourfile.bin > /dev/zed_oled; see for example http://people.mech.kuleuven.be/~lin.zhang/notes/emebedded-linux/zedboard-oled-display.html and, better, http://zedboard.org/content/zedboard-oled
The .bin file format. ... The screen is written right to left, top to bottom, with each pixel represented by a bit within one of the bytes in the .bin file. Bits are read in top-down 8 pixels, then move over 1 pixel and write the next 8 bits, and continue until you are at the end of the row. Then move down 8 pixels and do this again 3 more times.
You can do the writes from a C application; check the code at http://www.cnblogs.com/popo0904/p/3853144.html (you can use an online translation service to read the text).
Documentation on the PmodOLED kernel module used in the standard Zedboard demo: https://github.com/Digilent/linux-digilent/blob/master/Documentation/pmods/pmodoled.txt
The driver provides a 512 Byte display buffer for the display of PmodOLED.
The Whole screen is divided into four lines, each of them is 128 bits wide
and 8 bits high, as shown in the figure below.
+--------------------------...----------------------------+
|                         Line 4                           |
+--------------------------...----------------------------+
|                         Line 3                           |
+--------------------------...----------------------------+
|                         Line 2                           |
+--------------------------...----------------------------+  MSB (bit 7)
|                         Line 1                           |
+--------------------------...----------------------------+  LSB (bit 0)
 byte 127                                           byte 0
Users can perform read and write functions to the device node to access the data
inside the display buffer.
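As a concrete starting point, here is a minimal user-space sketch built only on what the documentation above states (a 512-byte display buffer behind /dev/zed_oled); it turns every pixel on:
#include <fcntl.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    unsigned char frame[512];          /* 4 lines x 128 bytes */
    memset(frame, 0xFF, sizeof frame); /* 0xFF = all pixels on; 0x00 = all off */

    int fd = open("/dev/zed_oled", O_WRONLY);
    if (fd < 0)
        return 1;
    write(fd, frame, sizeof frame);    /* overwrite the whole display buffer */
    close(fd);
    return 0;
}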
And there is the source code of the driver: https://github.com/Digilent/linux-digilent/blob/06b388903e5459687ba2538ae3895ffe29e2e39e/drivers/pmods/pmodoled-gpio.c
c) I have a Red Hat desktop; is there an SDK I can use to develop userspace drivers for the Zedboard from it?
The standard driver for this OLED on the Zedboard is a kernel-space driver; you can use it from the precompiled Zedboard firmware. Or you can build the kernel according to the Zedboard instructions, and all in-kernel drivers will be built too (if enabled in the kernel configuration): http://zedboard.org/content/creating-linux-kernel-image-boot-zc702-sd-card-slot