I'm developing a C++ timestamp parser that can check whether a given string is a valid timestamp representation, covering various formats.
I've tested a few libraries and finally settled on the single-header one developed by Howard Hinnant.
The only problem is with the Kitchen format 03:04AM (HH:MM<AM|PM>).
This is the code that I'm using:
#include "date.h"
#include <iostream>
#include <sstream>
#include <string>

int main()
{
    std::string const fmt = "%I:%M%p";
    std::string const in  = "3:04a.m.";

    date::fields<std::chrono::nanoseconds> fds{};
    std::chrono::minutes offset{};
    std::string abbrev;

    std::stringstream ss(in);
    date::from_stream(ss, fmt.c_str(), fds, &abbrev, &offset);
    if (!ss.fail())
    {
        if (fds.has_tod && fds.tod.in_conventional_range())
        {
            std::cout << "result hour " << fds.tod.hours().count() << '\n';
            std::cout << ". minutes " << fds.tod.minutes().count() << '\n';
            std::cout << ". seconds " << fds.tod.seconds().count() << '\n';
        }
    }
    else
    {
        std::cout << "failed" << std::endl;
    }
}
What am I doing wrong? The code works great with other formats. Is there a chance that parsing a date requires more fields (year, month, day) in order to be processed fully?
Hope I made myself clear, thanks in advance!
In the "C" locale, %p refers to one of AM or PM. You have "a.m.". Removing the '.' works for me.
There is one other caveat: the POSIX spec for strptime specifies that case should be ignored, and my date lib follows the POSIX spec on this. However, by default this library forwards to your std::library for this functionality, and some implementations didn't get the memo on this. They may not accept lower case.
If this happens for you, you can work around this std::lib bug by compiling with -DONLY_C_LOCALE on the command line (or set ONLY_C_LOCALE=1 in your IDE wherever macros are set). This tells the date lib to do the %p parse itself, instead of forwarding to the std::lib, and it will correctly do a case-insensitive parse. However, it assumes that the "C" locale is in effect.
When importing a PLY file into my program I get an error message saying that something went wrong:
C:\Users\...\data\apple.ply:8: property 'list uint8 int32 vertex_indices' of element 'face' is not handled
I used a sample ply file from: https://people.sc.fsu.edu/~jburkardt/data/ply/apple.ply
I have already tried different PLY files from different sources, but none of them work. When debugging the program, io::loadPLYFile doesn't generate a valid point cloud. The runtime library for PCL and for my program are the same.
#include <iostream>
#include <pcl/io/pcd_io.h>
#include <pcl/io/ply_io.h>
#include <pcl/point_types.h>
#include <pcl/search/kdtree.h>
#include <pcl/features/normal_3d_omp.h>
#include <pcl/surface/marching_cubes_rbf.h>

using namespace pcl;
using namespace std;

int
main (int argc, char** argv)
{
    PointCloud<PointXYZ>::Ptr cloud (new PointCloud<PointXYZ>);
    std::cout << "Start Debug?" << std::endl;
    std::cin.ignore();
    if (io::loadPLYFile<PointXYZ> (argv[1], *cloud) == -1) {
        cout << "ERROR: couldn't find file" << endl;
        return (1);
    } else {
        cout << "loaded" << endl;
        NormalEstimationOMP<PointXYZ, Normal> ne;
        search::KdTree<PointXYZ>::Ptr tree1 (new search::KdTree<PointXYZ>);
        tree1->setInputCloud (cloud);
        ne.setInputCloud (cloud);
        ne.setSearchMethod (tree1);
        ne.setKSearch (20);
        PointCloud<Normal>::Ptr normals (new PointCloud<Normal>);
        ne.compute (*normals);
        // ... (surface reconstruction continues from here)
    }
    return 0;
}
I would expect the PCL function io::loadPLYFile to load the files properly, as described in the documentation: http://docs.pointclouds.org/1.3.1/group__io.html
The console output is just a warning, as @kanstar already suggested! It can safely be ignored. The reason my program crashed in Debug but not in Release was that Visual Studio linked against the wrong library version of Boost, which caused the crash. Fixing the linkage made pcl::NormalEstimationOMP work as expected.
I'm trying to write a simple SIP sniffer using libtins, which works nicely.
I then try to pass the received packet to libosip.
Although it does parse the message properly, it dies silently.
I've no idea what could be wrong here; some help would be greatly appreciated!
this is my source:
#include <iostream>
#include <cstring>
#include "tins/tins.h"
#include <osip2/osip.h>
#include <osipparser2/osip_message.h>

using namespace Tins;

bool callback(const PDU &pdu)
{
    const IP &ip = pdu.rfind_pdu<IP>();    // Find the IP layer
    const UDP &udp = pdu.rfind_pdu<UDP>(); // Find the UDP layer

    osip_message_t *sip;
    osip_message_init(&sip);

    // First we print source and destination information
    std::cout << ip.src_addr() << ':' << udp.sport() << " -> "
              << ip.dst_addr() << ':' << udp.dport() << std::endl;

    // Extract the RawPDU object.
    const RawPDU &raw = udp.rfind_pdu<RawPDU>();

    // Finally, take the payload (this is a vector<uint8_t>)
    const RawPDU::payload_type &payload = raw.payload();

    // We create a string message
    std::string message(payload.begin(), payload.end());

    // Try to parse the message
    std::cout << "copying message with len " << message.size() << std::endl;
    const char *msg = message.c_str();
    std::cout << "parsing message with size " << strlen(msg) << std::endl;
    osip_message_parse(sip, msg, strlen(msg));
    std::cout << "freeing message" << std::endl;
    osip_message_free(sip);
    return true;
}

int main(int argc, char *argv[])
{
    if (argc != 2) {
        std::cout << "Usage: " << *argv << " <interface>" << std::endl;
        return 1;
    }
    // Sniff on the provided interface in promiscuous mode
    Sniffer sniffer(argv[1], Sniffer::PROMISC);
    // Only capture packets on port 5060 (SIP)
    sniffer.set_filter("port 5060");
    // Start the capture
    sniffer.sniff_loop(callback);
}
The output is this:
1.2.3.4:5060 -> 4.3.2.1:5060
copying message with len 333
parsing message with size 333
And it dies silently.
If I remove the line:
osip_message_parse( sip, msg, strlen( msg ) );
It keeps going perfectly...
Thanks a lot for your help!
I finally found the problem.
It is necessary to initialise the parser with
parser_init();
It's not documented anywhere :(
Now it's not dying on me anymore, but the parsing is not working properly. I need to investigate more.
Thanks everyone!
David
First, if a memory corruption happens earlier, the crash may show up in osip_message_parse even though that is not the origin of the initial corruption.
In order to test a sip message with libosip, you can go into the build directory of osip and create a file containing your sip message: mymessage.txt
$> ./src/test/torture_test mymessage.txt 0 -v
and even for a deeper check with valgrind:
$> valgrind ./src/test/.libs/torture_test mymessage.txt 0 -v
If your code is failing for all SIP messages, I guess the issue is a memory corruption outside libosip.
You do have another bug with the size of the SIP message:
osip_message_parse( sip, msg, strlen( msg ) );
A SIP message can contain binary data with '\0' characters inside, so your code should use the exact length of the binary payload, not strlen(). Such a change is required (but won't fix your main issue):
osip_message_parse( sip, msg, payload.size() );
I also advise you to try the latest osip git and to complete your question with a copy of a SIP message that fails.
EDIT: As David found, the init wasn't done, and that was the origin of the issue. However, the correct way to init is specified by the first lines of the documentation:
How-To initialize libosip2
When using osip, your first task is to initialize the parser and the state machine. This must be done prior to any use of libosip2.
#include <sys/time.h>
#include <osip2/osip.h>

int i;
osip_t *osip;

i = osip_init(&osip);
if (i != 0)
    return -1;
Maybe a stupid question, but I'd like to decode these kinds of Unicode characters. How do I do it?
۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇̚̚̚̚̚̚̚̚̚̚̚̚
۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇̚̚̚̚̚̚̚̚̚̚̚̚
I'd like to be able to decode them into \uXXXX syntax so I can use them, for instance, in a web page...
If you're using Windows, you can try copying and pasting that into the Character Map application.
In C++ (with an implementation that supports C++11, such as is included in Xcode):
#include <iostream>
#include <iomanip>
#include <string>
#include <cstdint>
int main() {
std::u32string s = U"ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇͛̍ͪͩ́͒͆̓̉̽̍̏͂ͮ̈́ͦ̀ͤ͗̅͗̄̐̃ͬͮͣͩͮ̆̓́͛ͯͤͣͧ̔ͮ̈́ͯ̅۫ͫ̈́̊̃͛͐̎̂̓̃̇̚̚̚̚̚̚̚̚̚̚̚̚";
std::cout.fill('0');
std::cout << std::hex;
for (auto c : s)
std::cout << "\\u" << std::setw(4) << static_cast<std::uint_least32_t>(c);
}
I'm trying to pass arguments in Xcode, and I understand you need to add them from the Args tab, using the Get Info button, in the Executables section of the Groups & Files pane. I'm trying to see if I can get it to work, but am having some difficulty. My program is simply:
#include <iostream>
#include <ostream>
using namespace std;
int main(int argc, char *argv[]) {
    for (int i = 0; i < argc; i++) {
        cout << argv[i];
    }
    return 0;
}
And in the Args tab, I have the number 2 and then, on another line, the number 1. I do not get any output when I run the program. What am I doing wrong? Thanks!
Your code works fine and it displays the arguments.
You may want to print a new line after each argument to make the output more readable:
cout << argv[i] << "\n";
Output is visible in the console (use Command+Shift+R to bring up the console).
I have stdin in a select() set and I want to take a string from stdin whenever the user types it and hits Enter.
But select triggers stdin as ready to read before Enter is hit and, in rare cases, before anything is typed at all. This hangs my program on getstr() until I hit Enter.
I tried setting nocbreak(), and it's almost perfect, except that nothing gets echoed to the screen, so I can't see what I'm typing. And setting echo() doesn't change that.
I also tried using timeout(0), but the results were even crazier and didn't work.
What you need to do is to check whether a character is available with the getch() function. If you use it in no-delay mode, the call will not block. Then eat up characters until you encounter a '\n', appending each char to the resulting string as you go.
Alternatively (and this is the method I use), use the GNU readline library. It has support for non-blocking behavior, but the documentation about that part is not so excellent.
Included here is a small example that you can use. It has a select loop, and uses the GNU readline library:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdbool.h>
#include <unistd.h>
#include <sys/select.h>
#include <readline/readline.h>
#include <readline/history.h>

static bool quit = false;

void rl_cb(char *line)
{
    if (NULL == line) {   /* EOF (e.g. Ctrl-D) */
        quit = true;
        return;
    }
    if (strlen(line) > 0)
        add_history(line);
    printf("You typed:\n%s\n", line);
    free(line);
}

int main(void)
{
    struct timeval to;
    fd_set fds;
    const char *prompt = "# ";

    rl_callback_handler_install(prompt, rl_cb);

    while (!quit) {
        /* select() may modify both the fd set and the timeout,
           so re-initialize them on every iteration */
        FD_ZERO(&fds);
        FD_SET(STDIN_FILENO, &fds);
        to.tv_sec = 0;
        to.tv_usec = 10000;
        if (select(STDIN_FILENO + 1, &fds, NULL, NULL, &to) > 0)
            rl_callback_read_char();
    }
    rl_callback_handler_remove();
    return 0;
}
Compile with:
gcc -Wall rl.c -lreadline
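If you'd rather not pull in readline, the eat-up-characters idea from the first paragraph can also be done with plain select() and read() on stdin. A minimal sketch (poll_line is a hypothetical helper name; error handling omitted, and note it bypasses curses input):

```cpp
#include <string>
#include <sys/select.h>
#include <unistd.h>

// Append whatever bytes are currently available on fd to buf.
// Returns true once buf contains a complete line (ending in '\n').
bool poll_line(int fd, std::string &buf)
{
    fd_set set;
    FD_ZERO(&set);
    FD_SET(fd, &set);
    timeval tv{0, 0};  // zero timeout: poll, never block
    if (select(fd + 1, &set, nullptr, nullptr, &tv) > 0) {
        char chunk[256];
        ssize_t n = read(fd, chunk, sizeof chunk);
        if (n > 0)
            buf.append(chunk, static_cast<size_t>(n));
    }
    return buf.find('\n') != std::string::npos;
}
```

Call poll_line(STDIN_FILENO, buf) from your main loop; when it returns true, take the line out of buf and carry on. Echoing is then up to you, since the terminal would typically be in raw/cbreak mode.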