Envoy Proxy L7 (with CA) OpenSSL error:SSL routines:ssl3_get_record:wrong version number #32567
Usually the … Another thing I see wrong is that there are no port 53 rules to trigger DNS interception, which is also required for ToFQDN rules to work. Lastly, I recommend checking the …
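For reference, DNS interception and a ToFQDN rule are typically combined in a single egress policy along these lines. This is a minimal sketch; the policy name and the pod label (`app: client`) are assumptions for illustration, not taken from this report:

```yaml
apiVersion: "cilium.io/v2"
kind: CiliumNetworkPolicy
metadata:
  name: client-egress-fqdn   # hypothetical name
spec:
  endpointSelector:
    matchLabels:
      app: client            # assumed pod label
  egress:
  # Port 53 rule with a dns rule: triggers DNS interception,
  # which ToFQDN selectors depend on.
  - toEndpoints:
    - matchLabels:
        k8s:io.kubernetes.pod.namespace: kube-system
        k8s-app: kube-dns
    toPorts:
    - ports:
      - port: "53"
        protocol: ANY
      rules:
        dns:
        - matchPattern: "*"
  # ToFQDN rule allowing the actual destination.
  - toFQDNs:
    - matchName: "dummyjson.com"
    toPorts:
    - ports:
      - port: "443"
        protocol: TCP
```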
Sorry for the delay in replying, and thank you for reviewing my issue @squeed. I left out other items in the policy for brevity; the DNS rule was present:

```yaml
- toPorts:
  - ports:
    - port: "53"
      protocol: ANY
    rules:
      dns:
      - matchPattern: "*"
```

I now test with a policy that only has two (egress) rules, one for DNS and one for the L7 filtering. You mention that DNS interception is a prerequisite for FQDN rules to work properly. I did notice that specifying the exact hostname in the DNS rule does not work:

```yaml
- toPorts:
  - ports:
    - port: "53"
      protocol: ANY
    rules:
      dns:
      - matchPattern: "dummyjson.com" # or: - matchName: dummyjson.com
      # only this works:
      # - matchPattern: "*"
```

cilium-envoy logs on the same node (I also notice these debug logs):

[2024-05-21 05:18:49.874][16][debug][router] [external/envoy/source/common/router/router.cc:1514] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] upstream headers complete: end_stream=false
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1803] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] closing connection due to connection close header
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1869] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] encoding headers via codec (end_stream=false):
':status', '200'
'content-type', 'text/plain; charset=UTF-8'
'cache-control', 'no-cache, max-age=0'
'x-content-type-options', 'nosniff'
'date', 'Tue, 21 May 2024 05:18:49 GMT'
'server', 'envoy'
'x-envoy-upstream-service-time', '0'
'connection', 'close'
[2024-05-21 05:18:49.874][16][debug][client] [external/envoy/source/common/http/codec_client.cc:128] [Tags: "ConnectionId":"15"] response complete
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1974] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] Codec completed encoding stream.
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"729"] closing data_to_write=277 type=0
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:47] [Tags: "ConnectionId":"729"] setting delayed close timer with timeout 1000 ms
[2024-05-21 05:18:49.874][16][debug][pool] [external/envoy/source/common/http/http1/conn_pool.cc:53] [Tags: "ConnectionId":"15"] response complete
[2024-05-21 05:18:49.874][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"15"] destroying stream: 0 remaining
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"729"] write flush complete
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"729"] closing socket: 1
[2024-05-21 05:18:49.874][16][debug][conn_handler] [external/envoy/source/common/listener_manager/active_stream_listener_base.cc:136] [Tags: "ConnectionId":"729"] adding to cleanup list
[2024-05-21 05:18:52.660][8][debug][main] [external/envoy/source/server/server.cc:239] flushing stats
[2024-05-21 05:18:53.421][15][debug][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:137] tls:onServerName(), requestedServerName: pps-token.svc.tst.tkp
[2024-05-21 05:18:53.421][15][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 172.x.x.x
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 172.x.x.x has ID 16777219
[2024-05-21 05:18:53.421][15][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 285520, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 5b500b04 (magic mark: b00, cluster: 4, ID: 23376), proxy_id: 17881
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 172.x.x.x has ID 16777219
[2024-05-21 05:18:53.421][15][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-21 05:18:53.422][15][debug][conn_handler] [external/envoy/source/common/listener_manager/active_tcp_listener.cc:160] [Tags: "ConnectionId":"730"] new connection from 172.x.x.x:35460
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:393] [Tags: "ConnectionId":"730"] new stream
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/filter_manager.cc:1065] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] Sending local reply with details http1.codec_error
[2024-05-21 05:18:53.422][15][debug][router] [cilium/uds_client.cc:56] Cilium access log resetting socket due to error: Broken pipe
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1803] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] closing connection due to connection close header
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1869] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] encoding headers via codec (end_stream=false):
':status', '400'
'content-length', '11'
'content-type', 'text/plain'
'date', 'Tue, 21 May 2024 05:18:53 GMT'
'server', 'envoy'
'connection', 'close'
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1974] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] Codec completed encoding stream.
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:247] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] doEndStream() resetting stream
[2024-05-21 05:18:53.423][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1938] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] stream reset: reset reason: local reset, response details: http1.codec_error
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"730"] closing data_to_write=156 type=2
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:47] [Tags: "ConnectionId":"730"] setting delayed close timer with timeout 1000 ms
[2024-05-21 05:18:53.423][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:439] [Tags: "ConnectionId":"730"] dispatch error: http/1.1 protocol error: HPE_INVALID_METHOD
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"730"] closing data_to_write=156 type=2
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"730"] write flush complete
[2024-05-21 05:18:53.453][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"730"] write flush complete
[2024-05-21 05:18:54.420][15][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:69] [Tags: "ConnectionId":"730"] triggered delayed close
[2024-05-21 05:18:54.420][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"730"] closing socket: 1
[2024-05-21 05:18:54.420][15][debug][conn_handler] [external/envoy/source/common/listener_manager/active_stream_listener_base.cc:136] [Tags: "ConnectionId":"730"] adding to cleanup list
[2024-05-21 05:18:57.660][8][debug][main] [external/envoy/source/server/server.cc:239] flushing stats

My nodes are the AWS EKS managed AMIs with version … These are the TCP packets I see with Hubble:

Edit: hubble observe output:

$ hubble observe --type trace:to-proxy
May 21 10:23:46.777: cilium-poc/client-7655c9f8d6-rktrm:56996 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:46.809: cilium-poc/client-7655c9f8d6-rktrm:56996 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:49.821: cilium-poc/client-7655c9f8d6-rktrm:51846 (ID:285520) -> kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:49.839: cilium-poc/client-7655c9f8d6-rktrm:51846 (ID:285520) <- kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:49.854: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:49.854: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:49.858: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:49.901: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:52.909: cilium-poc/client-7655c9f8d6-rktrm:33220 (ID:285520) -> kube-system/coredns-66f55b6cf5-lwv7w:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:52.912: cilium-poc/client-7655c9f8d6-rktrm:33220 (ID:285520) <- kube-system/coredns-66f55b6cf5-lwv7w:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:52.921: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:52.921: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:52.927: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:52.954: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:55.960: cilium-poc/client-7655c9f8d6-rktrm:56158 (ID:285520) -> kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:55.969: cilium-poc/client-7655c9f8d6-rktrm:56158 (ID:285520) <- kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:55.986: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:55.986: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:55.991: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:56.020: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)

Nothing with the HTTP protocol:

$ hubble observe --protocol http --from-pod cilium-poc/client-7655c9f8d6-rktrm

From the cilium-agent running on the same node: Running
Can you try a plaintext curl to port 443 instead? The version error is most likely because you're getting a plaintext error message back, since TLS is not configured correctly. Something like …
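The diagnosis above matches the classic symptom: an OpenSSL client that sends a ClientHello and receives plaintext bytes back fails in the record layer with "wrong version number". A small self-contained sketch of that failure mode (a local plaintext server standing in for the misconfigured TLS endpoint; no Cilium or Envoy involved):

```python
# A TLS client handshaking against a plaintext responder fails with an
# SSL record-layer error, typically "wrong version number".
import socket
import ssl
import threading

def serve_plaintext(srv):
    conn, _ = srv.accept()
    conn.recv(4096)  # the client's TLS ClientHello arrives here
    # Answer in plaintext, like an HTTP error page on a "TLS" port.
    conn.sendall(b"HTTP/1.1 400 Bad Request\r\ncontent-length: 0\r\n\r\n")
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
threading.Thread(target=serve_plaintext, args=(srv,), daemon=True).start()

ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

error = None
try:
    with socket.create_connection(srv.getsockname()) as raw:
        with ctx.wrap_socket(raw):  # handshake fails: the reply is not TLS
            pass
except ssl.SSLError as exc:
    error = exc

print(error)  # e.g. an [SSL: WRONG_VERSION_NUMBER]-style handshake error
```

This is why a plaintext `curl` to port 443 is a useful probe: it removes the TLS layer from the client side and exposes what the proxy is actually sending back.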
Traffic going to-stack without the HTTP rule, which works fine (so we are able to connect from our pod when bypassing the Envoy proxy):

May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:21:28.569: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.569: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.571: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:21:28.571: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.574: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.574: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.575: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:21:28.575: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.576: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:21:28.576: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.578: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:21:28.578: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.580: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.581: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:21:28.581: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.582: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:21:28.588: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.588: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.595: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:21:28.596: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: SYN)
May 22 05:21:28.733: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK)
May 22 05:21:28.752: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK, PSH)
May 22 05:21:29.062: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK, FIN)
May 22 05:21:29.197: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
May 22 05:21:29.197: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
May 22 05:21:29.198: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
With the HTTP rule:

May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.350: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.351: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:23:47.351: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:23:47.354: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:23:47.357: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.357: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.359: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:23:47.359: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: SYN)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (TCP)
May 22 05:23:47.369: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 22 05:23:47.416: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 22 05:23:47.954: cilium-poc/client-7655c9f8d6-2h7gm:42846 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: RST)

Trying with http:// on port 443 (with the HTTP rule enabled, so traffic goes through the proxy):

/ # curl -v --max-time 5 http://dummyjson.com:443/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
* Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
> GET /products/1 HTTP/1.1
> Host: dummyjson.com:443
> User-Agent: curl/8.5.0
> Accept: */*
>
< HTTP/1.1 503 Service Unavailable
< content-length: 118
< content-type: text/plain
< date: Wed, 22 May 2024 05:24:40 GMT
< server: envoy
<
* Connection #0 to host dummyjson.com left intact
upstream connect error or disconnect/reset before headers. retried and the latest reset reason: connection termination

Hubble flows for the http://:443 call:

May 22 05:34:37.485: cilium-poc/client-74987657df-4t9jr (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:34:37.485: cilium-poc/client-74987657df-4t9jr (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:34:37.486: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:34:37.486: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:34:37.487: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.494: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.505: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:34:37.505: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:34:37.518: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.518: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:34:37.521: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.523: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:34:37.524: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.525: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.527: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: SYN)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (TCP)
May 22 05:34:37.542: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 22 05:34:37.725: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.007: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.319: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.649: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.788: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, FIN)
May 22 05:34:38.788: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)

Debug logs of the envoy-proxy for the http://:443 call

Related to the call:
[2024-05-22 06:12:42.629][14][trace][filter] [cilium/network_filter.cc:99] [Tags: "ConnectionId":"4225"] cilium.network: SNI: dummyjson.com

[2024-05-22 05:28:04.102][16][trace][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:106] tls inspector: new connection accepted
[2024-05-22 05:28:04.103][16][trace][misc] [external/envoy/source/common/network/tcp_listener_impl.cc:116] TcpListener accepted 1 new connections.
[2024-05-22 05:28:04.103][16][trace][filter] [external/envoy/source/common/network/listener_filter_buffer_impl.cc:95] onFileEvent: 1
[2024-05-22 05:28:04.103][16][trace][filter] [external/envoy/source/common/network/listener_filter_buffer_impl.cc:60] recv returned: 90
[2024-05-22 05:28:04.104][16][trace][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:146] tls inspector: recv: 90
[2024-05-22 05:28:04.105][16][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 104.196.232.237
[2024-05-22 05:28:04.105][16][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-22 05:28:04.105][16][trace][filter] [cilium/conntrack.cc:197] cilium.bpf_metadata: Looking up key: 68c4e8ed, ac1f756f, 1bb, c7ea, 6, 0
[2024-05-22 05:28:04.106][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.107][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.107][16][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 296361, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 85a90b04 (magic mark: b00, cluster: 4, ID: 34217), proxy_id: 0
[2024-05-22 05:28:04.107][16][trace][misc] [external/envoy/source/common/event/scaled_range_timer_manager_impl.cc:60] enableTimer called on 0x6e37f4968c0 for 3600000ms, min is 3600000ms
[2024-05-22 05:28:04.107][16][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-22 05:28:04.107][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4015"] raising connection event 2
[2024-05-22 05:28:04.107][16][debug][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_tcp_listener.cc:159] [Tags: "ConnectionId":"4015"] new connection from 172.x.x.x:51178
[2024-05-22 05:28:04.108][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.108][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=1)
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 3
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4015"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4015"] read returns: 90
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:39] [Tags: "ConnectionId":"4015"] read error: Resource temporarily unavailable, code: 0
[2024-05-22 05:28:04.108][16][trace][filter] [cilium/network_filter.cc:196] cilium.network: onData 90 bytes, end_stream: false
[2024-05-22 05:28:04.108][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4015"] parsing 90 bytes
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:590] [Tags: "ConnectionId":"4015"] message begin
[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:391] [Tags: "ConnectionId":"4015"] new stream
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=Host value=dummyjson.com:443
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=User-Agent value=curl/8.5.0
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:841] [Tags: "ConnectionId":"4015"] onHeadersCompleteImpl
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=Accept value=*/*
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:1195] [Tags: "ConnectionId":"4015"] Server: onHeadersComplete size=3
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:945] [Tags: "ConnectionId":"4015"] message complete
[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1194] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] request headers complete (end_stream=true):
':authority', 'dummyjson.com:443'
':path', '/products/1'
':method', 'GET'
'user-agent', 'curl/8.5.0'
'accept', '*/*'
[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1177] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] request end stream
[2024-05-22 05:28:04.110][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4015"] current connecting state: false
[2024-05-22 05:28:04.111][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=cilium.l7policy status=0
[2024-05-22 05:28:04.111][16][debug][router] [external/envoy/source/common/router/router.cc:520] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] cluster 'egress-cluster' match for URL '/products/1'
[2024-05-22 05:28:04.111][16][debug][upstream] [external/envoy/source/common/upstream/upstream_impl.cc:426] transport socket match, socket default selected for host with address 104.196.232.237:443
[2024-05-22 05:28:04.111][16][debug][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:81] Created host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.111][16][trace][filter] [cilium/network_filter.cc:123] cilium.network: in upstream callback
[2024-05-22 05:28:04.111][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.118][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.118][16][debug][router] [external/envoy/source/common/router/router.cc:740] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] router decoding headers:
':authority', 'dummyjson.com:443'
':path', '/products/1'
':method', 'GET'
':scheme', 'http'
'user-agent', 'curl/8.5.0'
'accept', '*/*'
'x-forwarded-proto', 'http'
'x-envoy-internal', 'true'
'x-request-id', '7515c927-a261-4f73-a9e2-b3d18f3f6490'
'x-envoy-expected-rq-timeout-ms', '3600000'
[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.119][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.118][7][debug][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:231] addHost() adding egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.121][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.121][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src:
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4016"] current connecting state: true
[2024-05-22 05:28:04.121][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4016"] connecting
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4016"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4016"] connection in progress
[2024-05-22 05:28:04.121][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.router status=1
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4015"] parsed 90 bytes
[2024-05-22 05:28:04.125][7][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.125][14][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.126][14][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.126][7][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.125][16][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.127][16][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 2
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.258][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4016"] connected
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4016"] raising connection event 2
[2024-05-22 05:28:04.258][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4016"] connected
[2024-05-22 05:28:04.258][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4016"] attaching to next stream
[2024-05-22 05:28:04.258][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4016"] creating stream
[2024-05-22 05:28:04.258][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.258][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.258][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.258][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.258][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.258][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.258][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4016"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.258][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4016"] encode complete
[2024-05-22 05:28:04.258][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4016"] write returns: 232
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 2
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.395][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 3
[2024-05-22 05:28:04.395][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.396][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4016"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.396][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4016"] read returns: 0
[2024-05-22 05:28:04.397][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4016"] remote close
[2024-05-22 05:28:04.399][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4016"] closing socket: 0
[2024-05-22 05:28:04.401][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4016"] raising connection event 0
[2024-05-22 05:28:04.402][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4016"] parsing 0 bytes
[2024-05-22 05:28:04.402][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4016"] parsed 0 bytes
[2024-05-22 05:28:04.403][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4016"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:04.404][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4016"] request reset
[2024-05-22 05:28:04.405][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.406][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason:
[2024-05-22 05:28:04.406][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:04.407][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:04.408][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4016"] client disconnected, failure reason:
[2024-05-22 05:28:04.408][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:04.409][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:04.409][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:04.410][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:04.411][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:04.411][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:04.411][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:04.411][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4016"] destroying stream: 0 remaining
[2024-05-22 05:28:04.425][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:04.425][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.425][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.426][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.426][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.426][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.426][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.426][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src:
[2024-05-22 05:28:04.427][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4017"] current connecting state: true
[2024-05-22 05:28:04.427][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4017"] connecting
[2024-05-22 05:28:04.427][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4017"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.429][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4017"] connection in progress
[2024-05-22 05:28:04.429][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.429][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 2
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.565][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4017"] connected
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4017"] raising connection event 2
[2024-05-22 05:28:04.565][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4017"] connected
[2024-05-22 05:28:04.565][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4017"] attaching to next stream
[2024-05-22 05:28:04.565][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4017"] creating stream
[2024-05-22 05:28:04.565][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.565][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.567][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.567][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.567][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.567][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.568][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.569][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4017"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.569][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4017"] encode complete
[2024-05-22 05:28:04.569][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.569][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4017"] write returns: 232
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 2
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.706][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 3
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4017"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4017"] read returns: 0
[2024-05-22 05:28:04.708][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4017"] remote close
[2024-05-22 05:28:04.711][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4017"] closing socket: 0
[2024-05-22 05:28:04.711][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4017"] raising connection event 0
[2024-05-22 05:28:04.712][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4017"] parsing 0 bytes
[2024-05-22 05:28:04.713][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4017"] parsed 0 bytes
[2024-05-22 05:28:04.714][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4017"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:04.715][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4017"] request reset
[2024-05-22 05:28:04.715][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.722][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason:
[2024-05-22 05:28:04.722][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4017"] client disconnected, failure reason:
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:04.723][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:04.723][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4017"] destroying stream: 0 remaining
[2024-05-22 05:28:04.743][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:04.743][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.743][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.743][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.743][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src:
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4018"] current connecting state: true
[2024-05-22 05:28:04.743][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4018"] connecting
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4018"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4018"] connection in progress
[2024-05-22 05:28:04.744][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.744][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 2
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:04.881][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4018"] connected
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4018"] raising connection event 2
[2024-05-22 05:28:04.881][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4018"] connected
[2024-05-22 05:28:04.881][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4018"] attaching to next stream
[2024-05-22 05:28:04.881][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4018"] creating stream
[2024-05-22 05:28:04.881][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.881][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.882][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.882][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.883][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.883][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.885][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.885][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4018"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.885][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4018"] encode complete
[2024-05-22 05:28:04.886][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4018"] write returns: 232
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 2
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 3
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4018"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4018"] read returns: 0
[2024-05-22 05:28:05.023][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4018"] remote close
[2024-05-22 05:28:05.023][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4018"] closing socket: 0
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4018"] raising connection event 0
[2024-05-22 05:28:05.023][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4018"] parsing 0 bytes
[2024-05-22 05:28:05.023][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4018"] parsed 0 bytes
[2024-05-22 05:28:05.023][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4018"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:05.024][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4018"] request reset
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.024][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason:
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4018"] client disconnected, failure reason:
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:05.024][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:05.024][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4018"] destroying stream: 0 remaining
[2024-05-22 05:28:05.061][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:05.061][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:05.063][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:05.064][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:05.065][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src:
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4019"] current connecting state: true
[2024-05-22 05:28:05.065][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4019"] connecting
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4019"] connecting to 104.196.232.237:443
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4019"] connection in progress
[2024-05-22 05:28:05.065][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:05.065][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:05.203][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 2
[2024-05-22 05:28:05.204][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.205][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4019"] connected
[2024-05-22 05:28:05.206][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4019"] raising connection event 2
[2024-05-22 05:28:05.206][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4019"] connected
[2024-05-22 05:28:05.207][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4019"] attaching to next stream
[2024-05-22 05:28:05.208][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4019"] creating stream
[2024-05-22 05:28:05.209][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:05.209][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:05.210][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:05.211][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:05.212][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:05.212][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:05.214][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:05.215][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4019"] writing 232 bytes, end_stream false
[2024-05-22 05:28:05.216][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4019"] encode complete
[2024-05-22 05:28:05.216][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:05.216][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4019"] write returns: 232
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 2
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 3
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4019"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4019"] read returns: 0
[2024-05-22 05:28:05.354][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4019"] remote close
[2024-05-22 05:28:05.354][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4019"] closing socket: 0
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4019"] raising connection event 0
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4019"] parsing 0 bytes
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4019"] parsed 0 bytes
[2024-05-22 05:28:05.354][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4019"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:05.354][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4019"] request reset
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.354][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason:
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:05.354][16][debug][http] [external/envoy/source/common/http/filter_manager.cc:1035] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] Sending local reply with details upstream_reset_before_response_started{connection_termination}
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:1208] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encode headers called: filter=cilium.l7policy status=0
[2024-05-22 05:28:05.354][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1863] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encoding headers via codec (end_stream=false):
':status', '503'
'content-length', '118'
'content-type', 'text/plain'
'date', 'Wed, 22 May 2024 05:28:04 GMT'
'server', 'envoy'
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4015"] writing 135 bytes, end_stream false
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:1393] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encode data called: filter=cilium.l7policy status=0
[2024-05-22 05:28:05.355][16][trace][http] [external/envoy/source/common/http/conn_manager_impl.cc:1886] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encoding data via codec (size=118 end_stream=true)
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4015"] writing 118 bytes, end_stream false
[2024-05-22 05:28:05.355][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1968] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] Codec completed encoding stream.
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:05.355][16][trace][misc] [external/envoy/source/common/event/scaled_range_timer_manager_impl.cc:60] enableTimer called on 0x6e37f4968c0 for 3600000ms, min is 3600000ms
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4019"] client disconnected, failure reason:
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=7)
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:05.355][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=8)
[2024-05-22 05:28:05.355][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=8)
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4019"] destroying stream: 0 remaining
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 2
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4015"] write returns: 253
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 3
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4015"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4015"] read returns: 0
[2024-05-22 05:28:05.360][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4015"] remote close
[2024-05-22 05:28:05.360][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4015"] closing socket: 0
[2024-05-22 05:28:05.360][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4015"] raising connection event 0
[2024-05-22 05:28:05.360][16][trace][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_stream_listener_base.cc:125] [Tags: "ConnectionId":"4015"] tcp connection on event 0
[2024-05-22 05:28:05.360][16][debug][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_stream_listener_base.cc:135] [Tags: "ConnectionId":"4015"] adding to cleanup list
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=2)
[2024-05-22 05:28:05.759][7][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:250] Cleaning up stale original dst hosts.
[2024-05-22 05:28:05.759][7][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:289] Keeping active address 104.196.232.237:443.
[2024-05-22 05:28:06.626][7][debug][main] [external/envoy/source/server/server.cc:263] flushing stats
The multiple DNS lookups appear to be caused by the retries. Edit: noticeable is the `upstream reset: reset reason: connection termination` right after the 232 bytes of the TLS client hello are written. I agree this is a TLS misconfiguration somewhere in the envoy proxy, but that is what the cilium agent needs to set up from my policy.

The complete CiliumNetworkPolicy:

apiVersion: "cilium.io/v2"
kind: CiliumNetworkPolicy
metadata:
name: "client"
namespace: cilium-poc
spec:
endpointSelector:
matchLabels:
"k8s:app": client
egress:
- toPorts:
- ports:
- port: "53"
protocol: ANY
rules:
dns:
- matchPattern: "*"
- toFQDNs:
- matchName: "dummyjson.com"
toPorts:
- ports:
- port: '443'
protocol: TCP
rules:
http:
        - {}
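One possible explanation for why `matchName: dummyjson.com` did not work while `matchPattern: "*"` does (this is an assumption, not verified here): with the default `ndots:5` resolver config, the pod first queries search-domain expansions such as `dummyjson.com.cilium-poc.svc.cluster.local`, which an exact `matchName` does not cover. A sketch of a narrower DNS rule under that assumption:

```yaml
rules:
  dns:
  - matchName: "dummyjson.com"
  # hypothetical search-domain expansions; confirm the real query names first,
  # e.g. with `hubble observe --protocol dns` if Hubble is enabled
  - matchPattern: "dummyjson.com.*"
```

If the observed queries are only ever for the bare name, the extra pattern would be unnecessary.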
I also see a lot of these

$ k logs cilium-envoy-8cbsd -n kube-system -f | grep -i 'cilium.tls'
[2024-05-22 05:24:27.535][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:30.602][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:33.688][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:35.845][16][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:36.758][16][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket

coming from https://github.com/cilium/proxy/blob/main/cilium/tls_wrapper.cc#L66-69
What does your terminatingTLS configuration look like?
@lmb it's empty currently. I tried setting it earlier, but perhaps cilium did not clean up this configuration from envoy? I already tried re-adding the policy and restarting every component afterwards. Are there more details I can provide? I have two use cases that I need to validate on our cluster, both of which work in the lab:
These are the corresponding envoy logs for this call:

/ # curl -v --max-time 30 https://dummyjson.com/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
* Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
* ALPN: curl offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* CAfile: /etc/ssl/certs/ca-certificates.crt
* CApath: /etc/ssl/certs
* OpenSSL/3.1.4: error:0A00010B:SSL routines::wrong version number
* Closing connection
curl: (35) OpenSSL/3.1.4: error:0A00010B:SSL routines::wrong version number
[2024-05-22 09:41:56.311][14][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 104.196.232.237
[2024-05-22 09:41:56.311][14][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-22 09:41:56.311][14][trace][filter] [cilium/conntrack.cc:197] cilium.bpf_metadata: Looking up key: 68c4e8ed, ac1f711c, 1bb, b23c, 6, 0
[2024-05-22 09:41:56.312][14][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 09:41:56.312][14][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777218
[2024-05-22 09:41:56.313][14][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 291852, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 740c0b04 (magic mark: b00, cluster: 4, ID: 29708), proxy_id: 0
[2024-05-22 09:41:56.313][14][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 09:41:56.314][14][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777218
[2024-05-22 09:41:56.314][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 09:41:56.315][14][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-22 09:41:56.315][14][trace][filter] [cilium/network_filter.cc:99] [Tags: "ConnectionId":"63"] cilium.network: SNI: dummyjson.com
[2024-05-22 09:41:56.322][14][trace][filter] [cilium/network_filter.cc:196] cilium.network: onData 517 bytes, end_stream: false
[2024-05-22 09:41:56.323][14][debug][router] [cilium/uds_client.cc:56] Cilium access log resetting socket due to error: Connection reset by peer
[2024-05-22 09:41:56.324][14][trace][http] [external/envoy/source/common/http/filter_manager.cc:1208] [Tags: "ConnectionId":"63","StreamId":"7639742152903267136"] encode headers called: filter=cilium.l7policy status=0
[2024-05-22 09:41:56.326][14][trace][http] [external/envoy/source/common/http/filter_manager.cc:1393] [Tags: "ConnectionId":"63","StreamId":"7639742152903267136"] encode data called: filter=cilium.l7policy status=0

From the client pod:

/ # curl -v --max-time 30 http://dummyjson.com:443/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
* Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
> GET /products/1 HTTP/1.1
> Host: dummyjson.com:443
> User-Agent: curl/8.5.0
> Accept: */*
>
< HTTP/1.1 503 Service Unavailable
< content-length: 118
< content-type: text/plain
< date: Wed, 22 May 2024 09:43:19 GMT
< server: envoy
<
* Connection #0 to host dummyjson.com left intact
upstream connect error or disconnect/reset before headers. retried and the latest reset reason: connection termination/ |
The problem is that envoy can't match your incoming request with a server-side TLS certificate + key to use (this is the `Could not get server TLS context for port 443, defaulting to raw socket` message). That said, this seems to me like a configuration / troubleshooting issue and not a bug with Cilium. The issue tracker is not the right venue for this. Either try the Slack or consider an enterprise vendor: https://cilium.io/enterprise/
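For completeness, the server-side certificate + key that a policy's terminatingTLS stanza references is a standard `kubernetes.io/tls` Secret; a sketch (the name is a placeholder, and the namespace must match what the policy references):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: dummyjson-tls        # hypothetical name, referenced from terminatingTLS
  namespace: kube-system
type: kubernetes.io/tls
data:
  tls.crt: <base64-encoded certificate covering the target hostname>
  tls.key: <base64-encoded private key>
```

Since envoy presents this certificate when terminating the pod's TLS session, the client pod also has to trust the CA that signed it.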
@lmb thank you very much for your time and assistance with the issue. The good thing is that I have managed to get this working. It would be really great to not have to add the TLS termination configuration, though.
Once again, thank you for your time and effort.
@sjoukedv have you tried using TLS SNI enforcement instead?
Thank you for the suggestion. I gave this a try, but I reckon this also sends the traffic through the envoy proxy. For an L4 rule:
I have managed to get this to work only by enabling TLS termination as well as specifying the CA for the origin:

egress:
- toFQDNs:
- matchName: my-service.tld
toPorts:
- ports:
- port: '443'
protocol: TCP
terminatingTLS:
secret:
namespace: "kube-system"
name: "my-service-tls" # key and cert that matches the hostname(s) of the FQDN
originatingTLS:
secret:
namespace: "cilium-secrets"
name: "my-service-ca" # CA not for the termination above, but for connecting to the 'origin'
serverNames: # optional, also works without
- "my-service.tld"
rules:
http:
      - {}

I guess the following request should then be blocked:

curl -v -H "Host: other-service.tld" https://my-service.tld

But... this is where I notice the above request also succeeds while it is not in the list of allowed egress FQDNs.
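If the intent is to also pin the HTTP Host header, an explicit match in the L7 rule might help. A sketch, using the `host` field of the CiliumNetworkPolicy HTTP rule (the field exists in the schema as a regex match on the request authority, but whether it closes this particular gap is an assumption I have not verified):

```yaml
      rules:
        http:
        - host: "my-service.tld"   # regex matched against the Host/:authority header
```

With this in place, a request carrying `Host: other-service.tld` should get a 403 from the proxy instead of being forwarded.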
I don't think you need originating/terminatingTLS for that.
Have a look at the example from here: https://isovalent.com/blog/post/zero-trust-security-with-cilium/#h-tls-sni
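Along the lines of that example, an SNI-only policy needs no TLS secrets at all, since the proxy only inspects the client hello and never decrypts the traffic (a sketch with placeholder hostnames; without terminatingTLS there is consequently no HTTP-level filtering):

```yaml
egress:
- toFQDNs:
  - matchName: "my-service.tld"
  toPorts:
  - ports:
    - port: "443"
      protocol: TCP
    serverNames:
    - "my-service.tld"   # enforced against the TLS SNI; no rules.http here
```

Connections whose SNI is not in serverNames should be rejected at the TLS layer.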
@networkop thank you for getting back to me. I have seen the example you shared and that is also what I tried:
which works equally well with or without the `serverNames` list. I would expect that, when specifying the server names, traffic to hosts outside that list would be denied.
oh, if you need to actually do L7 (HTTP) policies, then yes, you'd need originating/terminatingTLS.
|
Is there an existing issue for this?
What happened?
I am trying to enforce L7 policies on an external endpoint and adding our own CA (as in this example)
The above FQDN has a valid certificate (it's a public API), but I also cannot get this to work with something that has a self-signed certificate and adding:
Removing the rules.http section makes it work, as traffic no longer goes through the envoy proxy (?).

Curl indicates the problem is with the openssl library
confirmed by using OpenSSL directly
Please let me know what further information I can provide.
Cilium Version
1.16.0-pre.2
Kernel Version
Linux ip-172-x-x-x.<region>.compute.internal 5.10.215-203.850.amzn2.x86_64 #1 SMP Tue Apr 23 20:32:19 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Kubernetes Version
Server Version: v1.29.3-eks-adc7111
Regression
No response
Sysdump
Can provide on request in private.
Relevant log output
No response
Anything else?
helm values
Cilium Users Document
Code of Conduct