QCustomPlot using timestamp to set date along one axis

I am having trouble setting the date properly. Basically I have timestamp, open, close, high, low, and volume values stored line by line in a text file (downloaded using the Yahoo API). My program reads each line and splits it into a QStringList, then puts each item in the list into the appropriate QVector (dates, open, close, high, low, volume), converting each item to a double. Here is where the problem is: it appears that precision is lost during the conversion. The dates always show as times back in 1970, when the actual timestamp is a date from a few days ago.
#include "dialog.h"
#include "ui_dialog.h"
#include<QFile>
#include<QTextStream>
#include<string>
#include<iostream>
#include<ctime>
using namespace std;
Dialog::Dialog(QWidget *parent) :
QDialog(parent),
ui(new Ui::Dialog)
{
ui->setupUi(this);
QStringList lines;
QString line;
QVector<double> dates;
QVector<double> high;
QVector<double> low;
QVector<double> open;
QVector<double> close;
QVector<double> volume;
QFile file ("YHOO.cvs");
if(file.open(QIODevice::ReadOnly))
{
QTextStream in(&file);
while (!in.atEnd())
{
line = in.readLine();
lines = line.split(",");
dates.append(lines[0].toDouble());
close.append(lines[1].toDouble());
high.append(lines[2].toDouble());
low.append(lines[3].toDouble());
open.append(lines[4].toDouble());
volume.append(lines[5].toInt());
}
file.close();
}
else{
QMessageBox::information(0,"info",file.errorString());
}
ui->plot->addGraph();
ui->plot->graph(0)->setData(dates, high);
ui->plot->xAxis->setTickLabelType(QCPAxis::ltDateTime);
ui->plot->xAxis->setDateTimeFormat("MM/dd/yyyy");
QPen pen;
pen.setColor(QColor(200,200,200));
ui->plot->graph(0)->setPen(pen);
ui->plot->graph(0)->setLineStyle(QCPGraph::lsLine);
ui->plot->graph(0)->setBrush(QBrush(QColor(160,50,150)));
ui->plot->xAxis->setRange(dates[0], dates[dates.length()-1]);
ui->plot->yAxis->setRange(*std::min_element(high.begin(), high.end()),*std::max_element(high.begin(),high.end()));
}
Dialog::~Dialog()
{
delete ui;
}
YHOO.cvs
20140227,30.1000,30.1600,28.4100,29.7000,2351300
20140228,28.3000,32.0000,27.0000,29.2000,3781000
20140303,28.1900,28.9100,26.8900,27.3000,1664900
20140304,30.0400,30.3800,28.6300,28.8500,2341700
20140305,28.5500,29.5000,28.4900,29.2400,7314100
20140306,27.1700,29.0100,27.1500,28.7600,3007300
20140307,27.2000,28.3200,26.7100,27.8400,2961800
20140310,28.2400,28.5000,27.3500,27.7200,1622100
20140311,27.5300,28.7400,27.1800,28.4400,1745200
20140312,28.5400,28.7400,27.3500,27.4700,2206300

I figured it out. It turns out the first column of data is a date, not a Unix timestamp: 20140227 means 2014-02-27, but read as seconds since the epoch it is only about 233 days, which is why every point landed in 1970. I came up with this function to convert the QString date to a double timestamp.
double timeStamp(QString qs)
{
    // qs holds a date like "20140227" (yyyyMMdd), not a Unix timestamp.
    // Requires <cstring> for memcpy and <cstdlib> for atoi.
    char date[11];                              // holds "YYYY\0MM\0DD\0"
    std::string s = qs.toStdString();
    memcpy(&date[0], s.c_str(), 4);             // year digits
    date[4] = '\0';
    memcpy(&date[5], s.c_str() + 4, 2);         // month digits
    date[7] = '\0';
    memcpy(&date[8], s.c_str() + 6, 2);         // day digits
    date[10] = '\0';

    struct tm tmdate = {0};
    tmdate.tm_year = atoi(&date[0]) - 1900;     // tm_year counts from 1900
    tmdate.tm_mon  = atoi(&date[5]) - 1;        // tm_mon is 0-based
    tmdate.tm_mday = atoi(&date[8]);

    time_t t = mktime(&tmdate);
    double actual_time_sec = difftime(t, 0);    // seconds since the Unix epoch
    return actual_time_sec;
}
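
For reference, Qt can do the same conversion without the manual character splitting. A minimal alternative sketch using QDateTime (add #include <QDateTime>; toTime_t() returns whole seconds since the Unix epoch):

double timeStamp(const QString& qs)
{
    // parse "20140227" as a date; the time of day defaults to 00:00:00 local time
    QDateTime dt = QDateTime::fromString(qs, "yyyyMMdd");
    return (double) dt.toTime_t();
}

Either version is then used in the read loop as dates.append(timeStamp(lines[0]));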

Related

Xilinx Echo Server Data Variable

I want to have my Zedboard return a numeric value, using the Xilinx lwIP echo example as a base, but no matter what I do I can't figure out what stores the data received or transmitted.
I have found the void* member named payload, but I don't know what to do with it.
(Screenshot: one instance of payload and a list of lwIP files.)
Below is the closest function to my goal:
err_t recv_callback(void *arg, struct tcp_pcb *tpcb,
                    struct pbuf *p, err_t err)
{
    /* do not read the packet if we are not in ESTABLISHED state */
    if (!p) {
        tcp_close(tpcb);
        tcp_recv(tpcb, NULL);
        return ERR_OK;
    }

    /* indicate that the packet has been received */
    tcp_recved(tpcb, p->len);

    /* echo back the payload */
    /* in this case, we assume that the payload is < TCP_SND_BUF */
    if (tcp_sndbuf(tpcb) > p->len) {
        err = tcp_write(tpcb, p->payload, p->len, 1);
        /* I need to change p->payload, but I don't know where it is given a value. */
    } else
        xil_printf("no space in tcp_sndbuf\n\r");

    /* free the received pbuf */
    pbuf_free(p);
    return ERR_OK;
}
Any guidance is appreciated.
Thanks,
Turtlemii
- I cheated and just made sure that the function has access to Global_tpcb from echo.c.
- tcp_write() reads in an address and seems to transmit each char from there.
void Print_Code()
{
    /* Prepare for TRANSMISSION */
    char header[] = "\rSwitch: 1 2 3 4 5 6 7 8\n\r"; // header text
    char data_t[] = "                       \n\r\r"; // area for storing the data; must be at
                                                     // least 23 chars wide so the slots at
                                                     // indexes 8, 10, ... 22 are in bounds
    unsigned char mask = 0x80;                       // mask to decode switches, starting at
                                                     // the highest bit (binary 10000000)
    swc_value = XGpio_DiscreteRead(&SWCInst, 1);     // save switch values

    /* Write switch values to the LEDs for visual. */
    XGpio_DiscreteWrite(&LEDInst, LED_CHANNEL, swc_value);

    for (int i = 0; i <= 7; i++)                     // load data_t with switch values (0/1)
    {
        data_t[8 + 2*i] = '0' + ((swc_value & mask) / mask); // convert one bit to 0/1
        mask = mask >> 1;                            // move to next bit
    }

    int len_header = *(&header + 1) - header;        // find the length of the header string
    int len_data = *(&data_t + 1) - data_t;          // find the length of the data string
    tcp_write(Global_tpcb, &header, len_header, 1);  // print the header
    tcp_write(Global_tpcb, &data_t, len_data, 1);    // print the data
}
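
To answer the original question: p->payload is where lwIP stores the received bytes; the stack fills in the pbuf before invoking the recv callback, so the callback is the place to read or transform the data. A sketch of copying the payload out and echoing a modified version, meant to replace the plain tcp_write() call inside recv_callback() above (the reply buffer and the upper-casing transform are made up for illustration; needs <string.h> for memcpy):

    char reply[128];                        /* hypothetical scratch buffer */
    u16_t n = p->len < sizeof reply ? p->len : sizeof reply;
    memcpy(reply, p->payload, n);           /* copy the received bytes out of the pbuf */
    for (u16_t i = 0; i < n; i++)           /* example transform: upper-case ASCII */
        if (reply[i] >= 'a' && reply[i] <= 'z')
            reply[i] -= 'a' - 'A';
    err = tcp_write(tpcb, reply, n, 1);     /* flag 1 (TCP_WRITE_FLAG_COPY) tells lwIP to copy the data */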

postgresql timestamp to std::chrono value

What is the appropriate way to work with the PostgreSQL datatype "timestamp without time zone" from C++ (libpqxx)? I haven't been able to find a way to do this yet.
I am restricted to the "timestamp without time zone" datatype in PostgreSQL, and the environment is running UTC time. I was hoping to find a mapping to a std::chrono::system_clock::time_point member, but I can't find one in libpqxx.
// s has a time_point member and r is a pqxx::result; r[0] is valid.
// This is what I hoped for, but libpqxx offers no such conversion:
s.creationtime = r[0]["creationtime"].as<std::chrono::system_clock::time_point>();
With help from boost:
s.creationtime = makeTimePoint(r[0]["creationtime"].as<string>());
makeTimePoint:
std::chrono::system_clock::time_point makeTimePoint(const std::string& s)
{
    using namespace boost::posix_time;
    using namespace std::chrono;

    const ptime ts = time_from_string(s);
    auto seconds = to_time_t(ts);                // whole seconds since the epoch
    time_duration td = ts.time_of_day();
    auto microseconds = td.fractional_seconds(); // sub-second part of the timestamp
    auto d = std::chrono::seconds{seconds} + std::chrono::microseconds{microseconds};
    system_clock::time_point tp{duration_cast<system_clock::duration>(d)};
    return tp;
}
The C++20 spec introduces a family of chrono::time_points called local_time:
// [time.clock.local], local time
struct local_t {};
template<class Duration>
using local_time = time_point<local_t, Duration>;
using local_seconds = local_time<seconds>;
using local_days = local_time<days>;
These time_points represent a "timestamp without a timezone".
There exists a free, open-source preview of this C++20 library here:
https://github.com/HowardHinnant/date
which is currently in use by other projects around the globe. This library has a few minor changes from the C++20 spec, such as putting everything in namespace date instead of namespace std::chrono.
Example program using this library:
#include "date/date.h"
#include <iostream>
int
main()
{
using namespace date;
using namespace std::chrono;
int y = 2019;
int m = 8;
int d = 28;
int H = 14;
int M = 42;
int S = 16;
int US = 500'000;
local_time<microseconds> lt = local_days{year{y}/m/d} + hours{H} +
minutes{M} + seconds{S} + microseconds{US};
std::cout << lt << '\n';
}
Output:
2019-08-28 14:42:16.500000
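
Since libpqxx returns the column as a string anyway, the same library can parse PostgreSQL's default text output straight into a local_time. A sketch, assuming the column renders in the usual "YYYY-MM-DD HH:MM:SS.ffffff" form:

#include "date/date.h"
#include <chrono>
#include <sstream>
#include <string>

date::local_time<std::chrono::microseconds>
makeLocalTimePoint(const std::string& s)
{
    date::local_time<std::chrono::microseconds> tp;
    std::istringstream in{s};
    in >> date::parse("%F %T", tp); // %F = YYYY-MM-DD, %T = HH:MM:SS[.ffffff]
    return tp;
}

// usage: auto tp = makeLocalTimePoint(r[0]["creationtime"].as<std::string>());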

Simple OS kernel -- screen output long string array issue

I am following the steps in "Writing a Simple Operating System from Scratch". I got the basic kernel up and running, but hit a strange error when I try to output a very long string.
The following code outputs all "0"s. But when I use a shorter msg string, the screen prints "x"s.
I wonder if anybody can help me, thanks much! (I am testing it in Bochs on Windows 8.)
Here is the kernel.c
void start()
{
    // Create a pointer to a char, and point it to the first text cell of
    // video memory (i.e. the top-left of the screen)
    unsigned char *video_memory = (unsigned char*) 0xb8000;

    // At the address pointed to by video_memory, store the character 'X'
    // (i.e. display 'X' in the top-left of the screen).

    // With this string, the outputs are all "0"s:
    const char msg[] = "abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyzAbcdefghijkAbcdefghijklmnopqrstuvwxyzA";
    // With this string, the outputs are all "x"s:
    //const char msg[] = "abcdefghijklmnopqrstuvwxyzabcdefghijk";

    int i = 0;
    int offset = 0;
    while (i < 85)
    {
        // Test if msg points to the right memory
        if (msg[i] == 0)
        {
            video_memory[offset] = '0';
        }
        else
        {
            video_memory[offset] = 'x';
        }
        video_memory[offset + 1] = 0x02; // green colour attribute
        i = i + 1;
        offset = offset + 2;
        //set_cursor(offset);
    }
}

GPS output being incorrectly written to file on SD card- Arduino

I have a sketch to take information (Lat, Long) from an EM-406a GPS receiver and write the information to an SD card on an Arduino shield.
The program is as follows:
#include <TinyGPS++.h>
#include <SoftwareSerial.h>
#include <SD.h>

TinyGPSPlus gps;
SoftwareSerial ss(4, 3); // pins for the GPS

Sd2Card card;
SdVolume volume;
SdFile root;
SdFile file;

void setup()
{
    Serial.begin(115200); // for the serial output
    ss.begin(4800);       // start ss at 4800 baud

    Serial.println("gpsLogger by Aaron McRuer");
    Serial.println("based on code by Mikal Hart");
    Serial.println();

    // initialize the SD card
    if (!card.init(SPI_FULL_SPEED, 9))
    {
        Serial.println("card.init failed");
    }

    // initialize a FAT volume
    if (!volume.init(&card)) {
        Serial.println("volume.init failed");
    }

    // open the root directory
    if (!root.openRoot(&volume)) {
        Serial.println("openRoot failed");
    }

    // create new file
    char name[] = "WRITE00.TXT";
    for (uint8_t i = 0; i < 100; i++) {
        name[5] = i / 10 + '0';
        name[6] = i % 10 + '0';
        if (file.open(&root, name, O_CREAT | O_EXCL | O_WRITE)) {
            break;
        }
    }
    if (!file.isOpen())
    {
        Serial.println("file.create");
    }
    file.print("Ready...\n");
}

void loop()
{
    bool newData = false;

    // For one second we parse GPS data and report some key values
    for (unsigned long start = millis(); millis() - start < 1000;)
    {
        while (ss.available())
        {
            char c = ss.read();
            //Serial.write(c); // uncomment this line if you want to see the GPS data flowing
            if (gps.encode(c)) // did a new valid sentence come in?
                newData = true;
        }
    }

    if (newData)
    {
        file.write(gps.location.lat());
        file.write("\n");
        file.write(gps.location.lng());
        file.write("\n");
    }
    file.close();
}
When I open the file on the SD card after the program has finished executing, I get a message that it has an encoding error.
I'm currently inside (and unable to get a GPS signal, hence the 0), but the encoding problem needs to be tackled, and there should be as many lines as there are seconds the device has been on. There's only one. What do I need to do to make this work correctly?
Closing the file in the loop, and never reopening it, is the reason there's only one set of data in your file.
Are you sure gps.location.lat() and gps.location.lng() return strings, not an integer or float? That would explain the binary data and the "encoding error" you see.
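In TinyGPS++, gps.location.lat() and gps.location.lng() do in fact return doubles, so file.write() truncates each value to a single raw byte instead of producing readable text. A sketch of the fixed loop body under that assumption, using the print()/sync() calls the SdFat-based SD library provides (SdFile inherits Arduino's Print):

    if (newData)
    {
        file.print(gps.location.lat(), 6); // print() formats the double as text, 6 decimal places
        file.print("\n");
        file.print(gps.location.lng(), 6);
        file.print("\n");
        file.sync();                       // flush to the card but keep the file open for the next pass
    }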

Drawing currency symbol

How do I draw a currency symbol in a custom label using the CGContextShowTextAtPoint method in drawRect:?
Here the symbol is in string format.
Any help would be appreciated!
Thanks
You have to resort to C-style strings, since that is what CGContextShowTextAtPoint() requires. To handle the locale correctly (the currency symbol changes with the locale), you must use setlocale(), then format your string using strfmon(), and finally pass the string created by strfmon() to CGContextShowTextAtPoint().
Documentation is available as follows from the terminal:
man 3 setlocale
man 3 strfmon
EDIT/UPDATE: For your information, strfmon() internally uses struct lconv. The structure can be retrieved with the function localeconv(). See man 3 localeconv for a detailed description of the fields available in the structure.
For instance, try the following simple C program, setting different locales:
#include <stdio.h>
#include <locale.h>
#include <monetary.h>

int main(void)
{
    char buf[BUFSIZ];
    double val = 1234.567;

    /* use your current locale */
    setlocale(LC_ALL, "");
    /* uncomment the next line to try the Italian locale instead */
    /* setlocale(LC_ALL, "it_IT"); */

    strfmon(buf, sizeof buf, "You owe me %n (%i)\n", val, val);
    fputs(buf, stdout);
    return 0;
}
The following uses localeconv():
#include <stdio.h>
#include <limits.h>
#include <locale.h>

int main(void)
{
    struct lconv l;
    int i;

    setlocale(LC_ALL, "");
    l = *localeconv();

    printf("decimal_point = [%s]\n", l.decimal_point);
    printf("thousands_sep = [%s]\n", l.thousands_sep);
    for (i = 0; l.grouping[i] != 0 && l.grouping[i] != CHAR_MAX; i++)
        printf("grouping[%d] = [%d]\n", i, l.grouping[i]);
    printf("int_curr_symbol = [%s]\n", l.int_curr_symbol);
    printf("currency_symbol = [%s]\n", l.currency_symbol);
    printf("mon_decimal_point = [%s]\n", l.mon_decimal_point);
    printf("mon_thousands_sep = [%s]\n", l.mon_thousands_sep);
    printf("positive_sign = [%s]\n", l.positive_sign);
    printf("negative_sign = [%s]\n", l.negative_sign);
    return 0;
}
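
Putting the two pieces together: a sketch (untested, assumed glue) that formats an amount for the current locale with strfmon() and draws it with Core Graphics; note that CGContextSelectFont()'s MacRoman encoding may not cover every currency glyph:

#include <locale.h>
#include <monetary.h>
#include <string.h>

/* inside -drawRect:, where a current context exists */
CGContextRef ctx = UIGraphicsGetCurrentContext();
char buf[64];

setlocale(LC_ALL, "");                   /* pick up the user's locale */
strfmon(buf, sizeof buf, "%n", 1234.56); /* e.g. "$1,234.56" in a US locale */

CGContextSelectFont(ctx, "Helvetica", 14.0, kCGEncodingMacRoman);
CGContextSetTextDrawingMode(ctx, kCGTextFill);
CGContextShowTextAtPoint(ctx, 160.0, 240.0, buf, strlen(buf));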
I don't really get what you're asking; checking the documentation, the call would look something like this:
CGContextRef ctx = UIGraphicsGetCurrentContext();
const char *string = "$";
// note: the context needs a font first (e.g. via CGContextSelectFont()), or nothing is drawn
CGContextShowTextAtPoint(ctx, 160, 240, string, 1);
Haven't tested it, but this should draw $ in the center of the screen.
BTW, why not use images?
~ Natanavra.