The following code doesn't work on the device. It doesn't enter the while loop on the device, but it runs fine on the simulator.
int status;
char value[1024] = "abcd";
FILE *fp = popen("openssl enc -aes-128-cbc -k secret -P -md sha1 2>&1", "r");
if (fp == NULL)
exit(1); // handle error
int i=0;
NSString *strAESKey;
while (fgets(value, 1024, fp) != NULL)
{
i++;
if(i==2)
{
strAESKey=[NSString stringWithFormat:@"%s",value];
break;
}
}
status = pclose(fp);
if (status == -1)
{
/* Error reported by pclose() */
}
else
{
/* Use macros described under wait() to inspect `status' in order
to determine success/failure of command executed by popen() */
}
Where am I going wrong?
The iOS application sandbox forbids use of the fork function, which popen uses. The simulator doesn't use the sandbox, but devices do.
You will need to use the openssl library directly instead of using the command-line program. The iOS public API doesn't include the openssl library, so you'll need to build a static library yourself. You can find some help doing this by searching. I'd start with this blog post.
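For example, the key/IV derivation that openssl enc -aes-128-cbc -k secret -P -md sha1 performs can be done in-process with EVP_BytesToKey. A rough sketch, assuming a static libcrypto is linked and with error handling kept minimal:

#include <string.h>
#include <openssl/evp.h>
#include <openssl/rand.h>

/* Derive a 128-bit AES key and IV from a passphrase the same way
 * "openssl enc -aes-128-cbc -k secret -P -md sha1" does:
 * SHA-1, one iteration, random 8-byte salt. */
static int derive_key_iv(const char *passphrase,
                         unsigned char key[16], unsigned char iv[16])
{
    unsigned char salt[8];
    if (RAND_bytes(salt, sizeof salt) != 1)
        return -1;
    int keylen = EVP_BytesToKey(EVP_aes_128_cbc(), EVP_sha1(), salt,
                                (const unsigned char *)passphrase,
                                (int)strlen(passphrase),
                                1 /* iteration count used by openssl enc */,
                                key, iv);
    return keylen == 16 ? 0 : -1;
}

The derived key and IV can then go straight into the EVP_EncryptInit_ex/EVP_EncryptUpdate/EVP_EncryptFinal_ex calls instead of shelling out to the command-line tool.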
We used linphone for our VoIP app and are facing an issue with the conference call duration: the duration does not increase after the calls are merged successfully.
It always displays 00:00, even though the call is established successfully. We used the code below for the conference call:
LinphoneAddress* linphoneAddress = linphone_core_interpret_url([LinphoneManager getLc],
    [username cStringUsingEncoding:[NSString defaultCStringEncoding]]);
if (linphoneAddress == NULL) {
return;
}
linphone_core_enter_conference(LC);
LinphoneCall *call = linphone_core_invite_address(LC, linphoneAddress);
-(NSString *)getCallDuration
{
    LinphoneCore *lc = [LinphoneManager getLc];
    int duration = linphone_core_get_current_call(lc) ?
        linphone_call_get_duration(linphone_core_get_current_call(lc)) : 0;
    NSString *str_duration = [LinphoneUtils durationToString:duration];
    return str_duration;
}
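A possible fallback we are considering is to track the conference start time ourselves and use it whenever linphone_core_get_current_call() returns NULL (untested sketch; conference_start is a hypothetical variable we would set just before calling linphone_core_enter_conference()):

#include <time.h>

/* Hypothetical: set this with time(NULL) right before entering the conference. */
static time_t conference_start = 0;

static int get_call_duration_seconds(LinphoneCore *lc)
{
    LinphoneCall *call = linphone_core_get_current_call(lc);
    if (call != NULL)
        return linphone_call_get_duration(call);      /* normal two-party call */
    if (conference_start != 0)
        return (int)(time(NULL) - conference_start);  /* merged / conference case */
    return 0;
}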
If anyone has a solution, please help us.
Thank you.
What I am trying to achieve here is to encrypt a message inside an ESP32 app built with PlatformIO and the Arduino framework.
After some searching, I found this repo: https://github.com/espressif/arduino-esp32
There is a header inside it that seems able to help me achieve what I want: https://github.com/espressif/arduino-esp32/blob/master/tools/sdk/include/mbedtls/mbedtls/rsa.h
I imported the library "mbedtls" at https://platformio.org/lib/show/10874/mbedtls into the PlatformIO project and started working from there.
Question: How to load private key file in the app and encrypt the message using the RSA tool?
What I have currently is:
mbedtls_pk_context pk;
mbedtls_mpi N, P, Q, D, E, DP, DQ, QP;
int ret = 1;
char buf[1024];
mbedtls_pk_init(&pk);
memset(buf, 0, sizeof(buf));
mbedtls_mpi_init(&N);
mbedtls_mpi_init(&P);
mbedtls_mpi_init(&Q);
mbedtls_mpi_init(&D);
mbedtls_mpi_init(&E);
mbedtls_mpi_init(&DP);
mbedtls_mpi_init(&DQ);
mbedtls_mpi_init(&QP);
ret = mbedtls_pk_parse_key(&pk, vendorPrivateKey, sizeof(vendorPrivateKey), NULL, NULL);
if (ret != 0) {
Serial.print(" failed! mbedtls_pk_parse_key returned: ");
Serial.print(-ret);
Serial.println();
}
if (mbedtls_pk_get_type(&pk) == MBEDTLS_PK_RSA) {
mbedtls_rsa_context *rsa = mbedtls_pk_rsa(pk);
if ((ret = mbedtls_rsa_export(rsa, &N, &P, &Q, &D, &E)) != 0
|| (ret = mbedtls_rsa_export_crt(rsa, &DP, &DQ, &QP)) != 0) {
Serial.println(" failed! could not export RSA parameters.");
}
}
For now I import the private key content directly as a char* (I'm not sure how to load a PEM key file into the app) through the header file:
const unsigned char *vendorPrivateKey = reinterpret_cast<const unsigned char *>(VENDOR_PRIVATE_KEY);
where the value is stored inside secrets.h
Then, when I run the program, it yields the following error message:
failed! mbedtls_pk_parse_key returned: 15616
According to the description in pk.h, this error code 15616 is 0x3D00 in hex, which indicates /**< Invalid key tag or value. */
Is there any website that provides format checking, to see whether my private key file fits the requirements of mbedtls?
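For completeness, this is the direction I am trying to go in. The mbedtls documentation notes that for PEM input the length passed to mbedtls_pk_parse_key must include the terminating '\0', which sizeof only provides when the key is a char array rather than a pointer. A sketch with a placeholder key, assuming the mbedtls 2.x API bundled with the ESP32 Arduino core:

#include <string.h>
#include "mbedtls/pk.h"

// Placeholder PEM key; the real one lives in secrets.h.
static const char vendorPrivateKey[] =
    "-----BEGIN RSA PRIVATE KEY-----\n"
    "...\n"
    "-----END RSA PRIVATE KEY-----\n";

static int parseVendorKey(mbedtls_pk_context *pk)
{
    mbedtls_pk_init(pk);
    // For PEM data the length must include the terminating '\0';
    // sizeof() works here because vendorPrivateKey is an array, not a pointer.
    return mbedtls_pk_parse_key(pk,
                                (const unsigned char *)vendorPrivateKey,
                                sizeof(vendorPrivateKey),
                                NULL, 0);
}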
I tried to get the STM32F446ZE and the STemWin Library running in a Keil Project.
The OS used is embOS from SEGGER.
I tried to follow the instructions from ST to display "hello world" on the display.
I use the display driver ST7735S which should be supported by the FlexColor driver.
My problem is that instead of "hello world" the display stays white.
I configured GUIConf.h like that:
#define GUI_NUMBYTES 0x8000
In GUI_X.c I added the following to GUI_X_Init():
void GUI_X_Init(void) {
UC_ST7735_init();
USInt current = Sopas_usiGet_IOLink14360ST7735DisplayBacklightCurrent();
LITA_setDisplayCurrent((uint8_t)current);
}
In UC_ST7735_init() the SPI port gets initialized. This part should work because it is already used in the same sensor type; the only difference is that I removed the graphics library we used before. The additional lines activate the display and its backlight.
In LCDConf_FlexColor_Template.c I set the resolution like this:
#define XSIZE_PHYS 100 // To be adapted to x-screen size
#define YSIZE_PHYS 40 // To be adapted to y-screen size
I filled in the following three methods as shown below.
It should be mentioned that UC_ST7735_writeSPI4Wire was also used before and should work. It sends one byte after another instead of 16 bits at once.
I tried sending the high byte and the low byte in both orders.
static void LcdWriteReg(U16 Data) {
uint8_t data[2];
UC_ST7735_dcFLAG_t dcFlag = UC_ST7735_COMMAND;
data[0] = (uint8_t)UTIL_HLP_LOW_BYTE_OF_INT16(Data);
data[1] = (uint8_t)UTIL_HLP_HIGH_BYTE_OF_INT16(Data);
UC_ST7735_writeSPI4Wire(&data[1], 1, dcFlag);
UC_ST7735_writeSPI4Wire(&data[0], 1, dcFlag);
}
static void LcdWriteData(U16 Data) {
UC_ST7735_dcFLAG_t dcFlag = UC_ST7735_DATA;
uint8_t data[2];
data[0] = (uint8_t)UTIL_HLP_LOW_BYTE_OF_INT16(Data);
data[1] = (uint8_t)UTIL_HLP_HIGH_BYTE_OF_INT16(Data);
UC_ST7735_writeSPI4Wire(&data[1], 1, dcFlag);
UC_ST7735_writeSPI4Wire(&data[0], 1, dcFlag);
}
static void LcdWriteDataMultiple(U16 * pData, int NumItems) {
UC_ST7735_dcFLAG_t dcFlag = UC_ST7735_DATA;
uint8_t data[2];
int start_size = NumItems;
while (NumItems--) {
data[0] = (uint8_t)UTIL_HLP_LOW_BYTE_OF_INT16(pData[start_size-NumItems]);
data[1] = (uint8_t)UTIL_HLP_HIGH_BYTE_OF_INT16(pData[start_size-NumItems]);
UC_ST7735_writeSPI4Wire(&data[1], 1, dcFlag);
UC_ST7735_writeSPI4Wire(&data[0], 1, dcFlag);
}
}
I didn't change LCD_X_Config()
void LCD_X_Config(void) {
GUI_DEVICE * pDevice;
CONFIG_FLEXCOLOR Config = {0};
GUI_PORT_API PortAPI = {0};
//
// Set display driver and color conversion
//
pDevice = GUI_DEVICE_CreateAndLink(GUIDRV_FLEXCOLOR, GUICC_565, 0, 0);
//
// Display driver configuration, required for Lin-driver
//
LCD_SetSizeEx (0, XSIZE_PHYS , YSIZE_PHYS);
LCD_SetVSizeEx(0, VXSIZE_PHYS, VYSIZE_PHYS);
//
// Orientation
//
Config.Orientation = GUI_SWAP_XY | GUI_MIRROR_Y;
GUIDRV_FlexColor_Config(pDevice, &Config);
//
// Set controller and operation mode
//
PortAPI.pfWrite16_A0 = LcdWriteReg;
PortAPI.pfWrite16_A1 = LcdWriteData;
PortAPI.pfWriteM16_A1 = LcdWriteDataMultiple;
PortAPI.pfReadM16_A1 = LcdReadDataMultiple;
GUIDRV_FlexColor_SetFunc(pDevice, &PortAPI, GUIDRV_FLEXCOLOR_F66708, GUIDRV_FLEXCOLOR_M16C0B16);
}
The code I run is:
GUI_Init();
int xPos, yPos;
//__HAL_RCC_CRC_CLK_ENABLE();
xPos = LCD_GetXSize() / 2;
yPos = LCD_GetYSize() / 3;
GUI_SetFont(GUI_FONT_COMIC24B_ASCII);
GUI_DispStringHCenterAt("Hello world!", xPos, yPos);
// Endless loop:
while(true)
{
OS_TASK_Delay(100);
MW_TWD_arm(); // Arm the watchdog
}
There is no compile error and the watchdog does not cause a problem.
The problem is that the display stays white.
I noticed that the command numbers which the FlexColor driver sends over SPI do not match the expected numbers from the ST7735 datasheet.
I debugged the SPI method. The following commands and data got sent:
a command with 0x00 as the high byte
a command with 0x03 as the low byte. As far as I know there is no command 0x03 for the ST7735
data with 0x00 as the high byte
data with 0x00 as the low byte
a command with 0x00 as the high byte
a command with 0x50 ('P') as the low byte.
data with 0x00 as the high byte
data with 0x00 as the low byte
a command with 0x00 as the high byte
a command with 0x51 ('Q') as the low byte.
data with 0x00 as the high byte
data with 0x63 as the low byte
a command with 0x00 as the high byte
a command with 0x52 as the low byte.
data with 0x00 as the high byte
data with 0x00 as the low byte ...
After a while, with more alternating commands and data, there is just data with 0x00.
Did I forget to configure something?
Update:
Changing GUIDRV_FLEXCOLOR_F66708 to GUIDRV_FLEXCOLOR_F66709 helped to get the right command numbers for the ST7735 display controller.
Changing GUIDRV_FLEXCOLOR_M16C0B16 to GUIDRV_FLEXCOLOR_M16C1B8 doesn't; my device reboots before any SPI communication happens.
I debugged the program to find out when the error occurs.
It runs the following methods without crashing, but after the last one I don't see any further method name in the disassembly window of Keil:
emwin_LCD_init
LCD_SETBk_ColorIndex
LCD_set_ClipRectMax
GUI_Alloc_getFixedBlock
GUI_Device_GetpDriver
GUIDRV_Flexcolor_InitOnce
GUI_Alloc_getFixedBlock
LCD_X_DisplayDriver
It gets to the end of LCD_X_DisplayDriver but never to LcdWriteReg, LcdWriteData, LcdWriteDataMultiple or LcdReadDataMultiple.
What could be the problem?
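For reference, this is the 8-bit bus variant I am considering trying next. It is an untested sketch and assumes GUI_PORT_API exposes the pfWrite8_A0 / pfWrite8_A1 / pfWriteM8_A1 / pfReadM8_A1 members and that GUIDRV_FLEXCOLOR_M16C0B8 selects a 16 bpp, 8-bit bus configuration, as described in the emWin FlexColor documentation:

/* Untested sketch: drive the ST7735 over an 8-bit bus so the port layer
 * does not have to split 16-bit values itself. */
static void LcdWriteReg8(U8 Cmd) {
    UC_ST7735_writeSPI4Wire(&Cmd, 1, UC_ST7735_COMMAND);
}

static void LcdWriteData8(U8 Data) {
    UC_ST7735_writeSPI4Wire(&Data, 1, UC_ST7735_DATA);
}

static void LcdWriteDataMultiple8(U8 * pData, int NumItems) {
    UC_ST7735_writeSPI4Wire(pData, NumItems, UC_ST7735_DATA);
}

static void LcdReadDataMultiple8(U8 * pData, int NumItems) {
    /* Read-back is not needed for a write-only setup; left empty here. */
    (void)pData;
    (void)NumItems;
}

/* In LCD_X_Config():
PortAPI.pfWrite8_A0  = LcdWriteReg8;
PortAPI.pfWrite8_A1  = LcdWriteData8;
PortAPI.pfWriteM8_A1 = LcdWriteDataMultiple8;
PortAPI.pfReadM8_A1  = LcdReadDataMultiple8;
GUIDRV_FlexColor_SetFunc(pDevice, &PortAPI, GUIDRV_FLEXCOLOR_F66709, GUIDRV_FLEXCOLOR_M16C0B8);
*/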
I am trying to erase one page in flash on an STM32F103RB like so:
FLASH_Unlock();
FLASH_ClearFlag(FLASH_FLAG_BSY | FLASH_FLAG_EOP | FLASH_FLAG_PGERR | FLASH_FLAG_WRPRTERR | FLASH_FLAG_OPTERR);
FLASHStatus = FLASH_ErasePage(Page);
However, FLASH_ErasePage fails, producing FLASH_ERROR_WRP.
Manually enabling/disabling write protection in the ST-LINK utility doesn't fix the problem.
Basically, FLASH_ErasePage fails with a WRP error, without attempting anything, if there is a previous WRP error in the status register.
As for your FLASH_ClearFlag call, at least FLASH_FLAG_BSY will cause assert_param(IS_FLASH_CLEAR_FLAG(FLASH_FLAG)); to fail (though I'm not really sure what happens in that case).
#define IS_FLASH_CLEAR_FLAG(FLAG) ((((FLAG) & (uint32_t)0xFFFFC0FD) == 0x00000000) && ((FLAG) != 0x00000000))
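A cleaned-up sequence for the F1 Standard Peripheral Library might look like the sketch below: FLASH_FLAG_BSY is left out of the clear call, and the address must point at a page-aligned location in main flash (1 KB pages on the STM32F103RB). PAGE_ADDR is a placeholder:

#define PAGE_ADDR  0x0801FC00  /* placeholder: last 1 KB page of an STM32F103RB */

FLASH_Status status;

FLASH_Unlock();
/* Clear only the clearable error/EOP flags, not FLASH_FLAG_BSY */
FLASH_ClearFlag(FLASH_FLAG_EOP | FLASH_FLAG_PGERR | FLASH_FLAG_WRPRTERR);
status = FLASH_ErasePage(PAGE_ADDR);
FLASH_Lock();

if (status != FLASH_COMPLETE) {
    /* handle FLASH_ERROR_WRP, FLASH_ERROR_PG, ... */
}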
What is your page address? Which address are you trying to access?
For instance, this example is tested on an STM32F100C8, not only for erasing but also for writing data correctly:
http://www.ozturkibrahim.com/TR/eeprom-emulation-on-stm32/
If you are using the HAL driver, your code might look like this (cut and paste from a real project):
static HAL_StatusTypeDef Erase_Main_Program ()
{
FLASH_EraseInitTypeDef ins;
uint32_t sectorerror;
ins.TypeErase = FLASH_TYPEERASE_SECTORS;
ins.Banks = FLASH_BANK_1; /* Do not care, used for mass-erase */
#warning We currently erase from sector 2 (only keep 64KB of flash for boot))
ins.Sector = FLASH_SECTOR_4;
ins.NbSectors = 4;
ins.VoltageRange = FLASH_VOLTAGE_RANGE_3; /* voltage-range defines how big blocks can be erased at the same time */
return HAL_FLASHEx_Erase (&ins, &sectorerror);
}
The internal function in the HAL driver that actually does the work
void FLASH_Erase_Sector(uint32_t Sector, uint8_t VoltageRange)
{
uint32_t tmp_psize = 0U;
/* Check the parameters */
assert_param(IS_FLASH_SECTOR(Sector));
assert_param(IS_VOLTAGERANGE(VoltageRange));
if(VoltageRange == FLASH_VOLTAGE_RANGE_1)
{
tmp_psize = FLASH_PSIZE_BYTE;
}
else if(VoltageRange == FLASH_VOLTAGE_RANGE_2)
{
tmp_psize = FLASH_PSIZE_HALF_WORD;
}
else if(VoltageRange == FLASH_VOLTAGE_RANGE_3)
{
tmp_psize = FLASH_PSIZE_WORD;
}
else
{
tmp_psize = FLASH_PSIZE_DOUBLE_WORD;
}
/* If the previous operation is completed, proceed to erase the sector */
CLEAR_BIT(FLASH->CR, FLASH_CR_PSIZE);
FLASH->CR |= tmp_psize;
CLEAR_BIT(FLASH->CR, FLASH_CR_SNB);
FLASH->CR |= FLASH_CR_SER | (Sector << POSITION_VAL(FLASH_CR_SNB));
FLASH->CR |= FLASH_CR_STRT;
}
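Note that the snippet above erases by sector because it comes from an F4-style part; on the STM32F103 from the question, the F1 HAL erases by page instead, so the equivalent call would look roughly like this (sketch; page_addr must be a page-aligned address, 1 KB pages on the F103RB):

static HAL_StatusTypeDef Erase_One_Page(uint32_t page_addr)
{
    FLASH_EraseInitTypeDef ins;
    uint32_t pageerror;
    HAL_StatusTypeDef status;

    ins.TypeErase   = FLASH_TYPEERASE_PAGES;
    ins.Banks       = FLASH_BANK_1;
    ins.PageAddress = page_addr;  /* page-aligned address in main flash */
    ins.NbPages     = 1;

    HAL_FLASH_Unlock();
    status = HAL_FLASHEx_Erase(&ins, &pageerror);
    HAL_FLASH_Lock();
    return status;
}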
Second thing to check: are interrupts enabled, and is there any hardware access between the unlock call and the erase call?
I hope this helps
I am currently working on PostgreSQL backup and restore functionality for my project. I have read the article http://www.codeproject.com/Articles/37154/PostgreSQL-PostGis-Operations and followed that approach. It was working fine, but recently I changed the PostgreSQL authentication method to password in the pg_hba.conf file, so it now prompts for the password whenever I execute psql.exe, pg_dump.exe, or pg_restore.exe. To provide the password from my project, I used the "RedirectStandardInput" method, but it did not work and psql or pg_dump still prompt for the password. "RedirectStandardOutput" and the error redirection, however, work fine.
I went through the PostgreSQL source code and found that GetConsoleMode and SetConsoleMode are used to remove the echo. I suspect (though I'm not sure) this could be the reason why I am unable to redirect the input.
PostgreSQL source code that prompts for the password:
simple_prompt(const char *prompt, int maxlen, bool echo)
{
int length;
char *destination;
FILE *termin,
*termout;
#ifdef HAVE_TERMIOS_H
struct termios t_orig,
t;
#else
#ifdef WIN32
HANDLE t = NULL;
LPDWORD t_orig = NULL;
#endif
#endif
destination = (char *) malloc(maxlen + 1);
if (!destination)
return NULL;
/*
* Do not try to collapse these into one "w+" mode file. Doesn't work on
* some platforms (eg, HPUX 10.20).
*/
termin = fopen(DEVTTY, "r");
termout = fopen(DEVTTY, "w");
if (!termin || !termout
#ifdef WIN32
/* See DEVTTY comment for msys */
|| (getenv("OSTYPE") && strcmp(getenv("OSTYPE"), "msys") == 0)
#endif
)
{
if (termin)
fclose(termin);
if (termout)
fclose(termout);
termin = stdin;
termout = stderr;
}
#ifdef HAVE_TERMIOS_H
if (!echo)
{
tcgetattr(fileno(termin), &t);
t_orig = t;
t.c_lflag &= ~ECHO;
tcsetattr(fileno(termin), TCSAFLUSH, &t);
}
#else
#ifdef WIN32
if (!echo)
{
/* get a new handle to turn echo off */
t_orig = (LPDWORD) malloc(sizeof(DWORD));
t = GetStdHandle(STD_INPUT_HANDLE);
/* save the old configuration first */
GetConsoleMode(t, t_orig);
/* set to the new mode */
SetConsoleMode(t, ENABLE_LINE_INPUT | ENABLE_PROCESSED_INPUT);
}
#endif
#endif
if (prompt)
{
fputs(_(prompt), termout);
fflush(termout);
}
if (fgets(destination, maxlen + 1, termin) == NULL)
destination[0] = '\0';
length = strlen(destination);
if (length > 0 && destination[length - 1] != '\n')
{
/* eat rest of the line */
char buf[128];
int buflen;
do
{
if (fgets(buf, sizeof(buf), termin) == NULL)
break;
buflen = strlen(buf);
} while (buflen > 0 && buf[buflen - 1] != '\n');
}
if (length > 0 && destination[length - 1] == '\n')
/* remove trailing newline */
destination[length - 1] = '\0';
#ifdef HAVE_TERMIOS_H
if (!echo)
{
tcsetattr(fileno(termin), TCSAFLUSH, &t_orig);
fputs("\n", termout);
fflush(termout);
}
#else
#ifdef WIN32
if (!echo)
{
/* reset to the original console mode */
SetConsoleMode(t, *t_orig);
fputs("\n", termout);
fflush(termout);
free(t_orig);
}
#endif
#endif
if (termin != stdin)
{
fclose(termin);
fclose(termout);
}
return destination;
}
Please help me here: how can I send the password to psql or pg_dump from C# code?
Since this is local to the application, the best thing to do is to set %PGPASSWORD%; then psql will not ask for the password. A .pgpass file could be used if you wanted to avoid supplying a password at all. In general you do not want to set environment variables from the command line, since they may be visible to other users, but that is not a concern here.
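As a rough illustration of the idea in C (from C# the equivalent is to add PGPASSWORD to ProcessStartInfo.EnvironmentVariables before starting pg_dump); the command line and database name below are placeholders:

#include <stdio.h>
#include <stdlib.h>

/* Sketch: put PGPASSWORD into the environment inherited by the child
 * process so pg_dump never prompts. */
int run_pg_dump(const char *password, const char *dbname)
{
#ifdef _WIN32
    _putenv_s("PGPASSWORD", password);
#else
    setenv("PGPASSWORD", password, 1);
#endif

    char cmd[512];
    snprintf(cmd, sizeof cmd, "pg_dump -U postgres -F c -f backup.dump %s", dbname);
    return system(cmd);  /* child inherits PGPASSWORD */
}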