[09:20:51] <pmlopes> @ovestoerholt i think that your ngx_http_auth_request_module idea would be great for the awesome project since it would allow vert.x users that use nginx to have a simple authN solution that is handled at the proxy
[09:37:29] <ovestoerholt> @pmlopes I thought so too, and was kinda amazed that I could not find anything like it…
[09:38:07] <ovestoerholt> @pmlopes But I am not sure how to contribute, as I have never done so before…
[09:39:50] <pmlopes> the awesome project is very simple, you just make an update to the markdown file with a link to your project under: https://github.com/vert-x3/vertx-awesome#authentication-authorisation
[09:43:44] <ovestoerholt> Sounds not too hard! Will do some production testing first and then do the markdown update.
[14:08:54] *** ChanServ sets mode: +o temporal_
[15:33:32] <xkr47> hi
[15:33:43] <xkr47> is there some good way to “ignore” a request body?
[15:34:30] <xkr47> let's say I get a POST to a URL with megabytes of data and I immediately want to respond 404 and ignore the body?
[15:35:26] <xkr47> should I a) register a dummy handler that does nothing b) call request.connection().close(); c) something else?
[16:38:53] <temporal_> xkr47 I think you need to do nothing by default
[16:38:57] <temporal_> you cannot ignore a body
[16:39:05] <temporal_> if you need something more conditional
[16:39:10] <temporal_> you can use Expect: 100-continue
[16:39:20] <temporal_> or if you use HTTP/2 you can reset the stream
[16:39:43] <temporal_> look for 100-Continue handling in vertx core doc
[17:42:28] <xkr47> this is a server implementation
[17:44:16] <xkr47> so the client might not send Expect: 100-continue
[18:31:04] <xkr47> but yeah the 100-continue documentation is good
[18:39:20] <temporal_> xkr47 I just wanted to point out how it should be handled and that you cannot reset or close the connection if you want to respect HTTP/1.x
[18:39:35] <temporal_> and you should not send a response before receiving the request
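The 100-continue flow described above might look roughly like this on a Vert.x 3.x HTTP server; this is only a sketch, and the port and the `resourceExists` check are invented for illustration:

```java
import io.vertx.core.Vertx;

// Sketch: when the client sends Expect: 100-continue, the server can
// reject the request with 404 before any body bytes are transmitted,
// or call writeContinue() to tell the client to send the body.
public class ContinueExample {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    vertx.createHttpServer().requestHandler(request -> {
      if ("100-continue".equalsIgnoreCase(request.getHeader("Expect"))) {
        if (!resourceExists(request.path())) {
          // Reject before the body is ever sent.
          request.response().setStatusCode(404).end();
          return;
        }
        // Go ahead: the client will now send the body.
        request.response().writeContinue();
      }
      request.bodyHandler(body -> request.response().end("ok"));
    }).listen(8080);
  }

  // Placeholder for whatever lookup decides whether the URL is served.
  static boolean resourceExists(String path) {
    return !path.startsWith("/missing");
  }
}
```

Note that without the Expect header the client sends the body unconditionally, which is what the rest of the discussion below is about.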
[18:47:47] <xkr47> temporal_, so then I guess unless an Expect: 100-continue header is present, I should do “a) register a dummy handler that does nothing” i.e. just receive the data without doing anything with it..
[18:48:07] <temporal_> xkr47 exact!
[18:48:32] <xkr47> ok great
[18:48:44] <xkr47> thanks
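The agreed approach, option (a), could be sketched like this in Vert.x 3.x (port is arbitrary; this is an illustration, not the poster's actual code):

```java
import io.vertx.core.Vertx;

// Sketch: drain an unwanted request body with a no-op data handler,
// then answer 404 once the whole request has been received.
public class DrainBodyExample {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    vertx.createHttpServer().requestHandler(request -> {
      // Consume and discard each body chunk so the connection stays
      // usable for keep-alive without buffering megabytes in memory.
      request.handler(buffer -> { /* discard */ });
      // Per the advice above, only respond after the request has ended.
      request.endHandler(v -> request.response().setStatusCode(404).end());
    }).listen(8080);
  }
}
```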
[20:00:40] <xkr47> hmm actually it seems the data is received even if there is no data handler installed
[20:02:42] <xkr47> which also means I will lose data if I don't immediately register a handler when the request arrives
[20:07:07] <xkr47> that also means if I want to save the request data to a file, and I open the file asynchronously - and, obviously - can't start streaming until the file has opened later, I might miss data from the beginning of the request
[20:08:17] <xkr47> this is an interesting implementation choice…
[20:09:35] <xkr47> so basically it (sounds to me that it) would be good to always call request.pause() the first thing you do when handling a new request, and call resume after starting the pump or whatever..
[20:17:33] <AlexLehm> xkr47: it depends on where you set the handler, if there is something async happening inbetween you have to pause the request
[20:17:48] <xkr47> yes
[20:18:18] <xkr47> I'm writing a replacement for RoutingContext which supports wrapping requests & responses kind of like you could in servlet filters
[20:18:54] <xkr47> and now I just decided I will call pause() the first thing I do and call resume() when someone calls handler() with a non-null value
[20:19:33] <AlexLehm> ok, that sounds good
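The pause()/resume() pattern just discussed might look like this in Vert.x 3.x; the file path and port are placeholders, and this is a sketch of the idea rather than the poster's implementation:

```java
import io.vertx.core.Vertx;
import io.vertx.core.file.AsyncFile;
import io.vertx.core.file.OpenOptions;
import io.vertx.core.streams.Pump;

// Sketch: pause the request before any async work so no body chunks
// are dropped, then resume once the file is open and the pump runs.
public class PauseResumeExample {
  public static void main(String[] args) {
    Vertx vertx = Vertx.vertx();
    vertx.createHttpServer().requestHandler(request -> {
      request.pause(); // nothing is delivered (or lost) while paused
      vertx.fileSystem().open("/tmp/upload.bin", new OpenOptions(), ar -> {
        if (ar.succeeded()) {
          AsyncFile file = ar.result();
          Pump.pump(request, file).start(); // stream body into the file
          request.endHandler(v -> file.close(closed ->
              request.response().end("stored")));
          request.resume(); // release buffered data into the pump
        } else {
          // Couldn't open the file: drain and discard, then report failure.
          request.handler(buf -> { /* discard */ });
          request.endHandler(v -> request.response().setStatusCode(500).end());
          request.resume();
        }
      });
    }).listen(8080);
  }
}
```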
[20:22:11] <xkr47> would make sense to me to have this as normal behaviour..
[20:22:21] <xkr47> but I don't know all the decoder stuff yet so we'll see :)
[20:23:07] <xkr47> writing a tls-http2/1.1 proxy that proxies to a http/1.1 server
[20:23:25] <xkr47> with client certificate, oauth authentication etc support
[20:23:46] <xkr47> here: https://github.com/NitorCreations/nitor-backend/
[21:01:40] <chermah> hi guys
[21:01:44] <chermah> is there anyone here?
[21:01:48] <chermah> i need help
[21:02:50] <xkr47> I'm here but I'm not an expert :)