Integrating rsyslog with Logstash to include only Fortinet syslog

I have integrated rsyslog with Logstash but cannot see any output on stdout, and there are no error logs. This is the Logstash config:

input {
  udp {
    host => "10.200.253.122"
    port => 5140
    type => "fortinet"
  }
}


output {
  stdout {}
}
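
For the "only Fortinet" part, a minimal rsyslog forwarding rule might look like the following sketch. The drop-in file name and the FortiGate source IP (10.200.253.121) are assumptions; adjust both to your environment:

```
# /etc/rsyslog.d/50-fortinet-forward.conf  (hypothetical filename)
# Forward only messages arriving from the FortiGate (assumed source IP)
# to the Logstash UDP listener, then stop so they are not also written
# to the local log files.
if $fromhost-ip == '10.200.253.121' then {
    action(type="omfwd" target="10.200.253.122" port="5140" protocol="udp")
    stop
}
```

Restart rsyslog (`systemctl restart rsyslog`) after adding the rule, and check `rsyslogd -N1` first to validate the configuration syntax.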

and these are the logs from running:

 /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash-fortinet.conf
Using bundled JDK: /usr/share/logstash/jdk
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2025-06-04 19:18:11.184 [main] runner - Starting from version 9.0, running with superuser privileges is not permitted unless you explicitly set 'allow_superuser' to true, thereby acknowledging the possible security risks
[WARN ] 2025-06-04 19:18:11.196 [main] runner - NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[WARN ] 2025-06-04 19:18:11.206 [main] runner - 'pipeline.buffer.type' setting is not explicitly defined.Before moving to 9.x set it to 'heap' and tune heap size upward, or set it to 'direct' to maintain existing behavior.
[INFO ] 2025-06-04 19:18:11.207 [main] runner - Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-linux]"}
[INFO ] 2025-06-04 19:18:11.218 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[INFO ] 2025-06-04 19:18:11.406 [main] StreamReadConstraintsUtil - Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[INFO ] 2025-06-04 19:18:11.406 [main] StreamReadConstraintsUtil - Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[WARN ] 2025-06-04 19:18:11.602 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2025-06-04 19:18:12.379 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2025-06-04 19:18:12.789 [Converge PipelineAction::Create<main>] Reflections - Reflections took 153 ms to scan 1 urls, producing 152 keys and 530 values
[INFO ] 2025-06-04 19:18:13.014 [Converge PipelineAction::Create<main>] javapipeline - Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[INFO ] 2025-06-04 19:18:13.052 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/logstash-fortinet.conf"], :thread=>"#<Thread:0x16caf50c /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[INFO ] 2025-06-04 19:18:13.740 [[main]-pipeline-manager] javapipeline - Pipeline Java execution initialization time {"seconds"=>0.69}
[INFO ] 2025-06-04 19:18:13.765 [[main]-pipeline-manager] javapipeline - Pipeline started {"pipeline.id"=>"main"}
[INFO ] 2025-06-04 19:18:13.782 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2025-06-04 19:18:13.837 [[main]<udp] udp - Starting UDP listener {:address=>"10.200.253.122:5140"}
[INFO ] 2025-06-04 19:18:13.845 [[main]<udp] udp - UDP listener started {:address=>"10.200.253.122:5140", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}

Is rsyslog (or are the clients directly) forwarding messages to 10.200.253.122:5140 over UDP?

Check what dot-mike said.

[INFO ] 2025-06-04 19:18:13.845 [[main]<udp] udp - UDP listener started {:address=>"10.200.253.122:5140", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}

The LS listener has started, so you have to allow the network traffic through, or at least open local port 5140/udp if you have an active firewall.
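
As a sketch of those checks, assuming a Linux host with iproute2 and firewalld (substitute ufw/iptables equivalents as needed):

```shell
# Confirm Logstash is actually bound to the expected address/port
ss -lun | grep 5140

# If firewalld is active, open the syslog port (firewalld assumed here)
firewall-cmd --add-port=5140/udp --permanent
firewall-cmd --reload
```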

You can use tcpdump to capture any incoming traffic.
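
For example (assuming tcpdump and netcat are installed; run the capture as root on the Logstash host):

```shell
# On the Logstash host: watch for syslog datagrams arriving on UDP/5140
tcpdump -ni any udp port 5140 -A

# From the rsyslog side: send a one-off test datagram to confirm the path
echo '<189>test: hello fortinet pipeline' | nc -u -w1 10.200.253.122 5140
```

If tcpdump shows the packet but Logstash prints nothing, the problem is on the Logstash side; if nothing arrives at all, look at the firewall or the rsyslog forwarding rule.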

Hi,

I think that is the issue I am trying to solve.

Thanks
