I'm trying to develop an HTTPS server.
I already have a working HTTP server, and I've copied the code up to the point where a client sends its first message – the Client Hello. From RFC 5246, section 7.4.1.2, this should start with the current time as an unsigned 32-bit Unix epoch timestamp (gmt_unix_time).
I am already getting stuck. The other protocols (HTTP, FTP, SMTP) use human readable text but I understand the Client Hello is in binary.
So I figured the first task was to turn the first 4 bytes into the 32 bit time value.
I wrote:
memcpy(&ClientHelloTime, buffer, 4);
which I thought would set ClientHelloTime to the current Unix epoch time; unfortunately, it is not even close.
I happened to notice that if I start the copy 8 bytes into the buffer, I get a number whose first four digits match the current Unix epoch time (about 1.375 x 10^9).
I'm stuck. I'm not confident I'm reading the right RFC, and I'm not sure I'm transferring the bytes from the buffer into the unsigned integer ClientHelloTime correctly.
Any thoughts?
Thanks,
John