
Log Records getting skipped after rotation #1448

Open
Akhilesh53 opened this issue Jun 13, 2024 · 2 comments
Comments

@Akhilesh53

Describe the bug
After rotation (creation of a new file), the very first log line is skipped and does not appear in the new log file.

To Reproduce
```go
func initiliaselogger() {
	level := getLevel(env.ENV.LOG_LEVEL)

	devConfig := zap.NewDevelopmentEncoderConfig()
	prodConfig := zap.NewProductionEncoderConfig()

	devConfig.EncodeTime = zapcore.RFC3339NanoTimeEncoder
	prodConfig.EncodeTime = zapcore.RFC3339NanoTimeEncoder

	hostname, err := os.Hostname()
	if err != nil {
		hostname = ""
	}
	if hostname != "" {
		hostname = "_" + hostname
	}

	// lumberjack handles rotation: MaxSize is in megabytes, MaxAge in days.
	filewriter := zapcore.AddSync(&lumberjack.Logger{
		Filename:   "logs/regulatory/" + strings.ToLower(strings.ReplaceAll(env.ENV.PROCESS_NAME+hostname, " ", "_")) + ".log",
		MaxSize:    40,
		MaxAge:     30,
		MaxBackups: 100,
		Compress:   false, // disabled by default
	})

	core := zapcore.NewCore(zapcore.NewJSONEncoder(prodConfig), filewriter, level)

	// In dev, also tee logs to stdout with the console encoder.
	if env.ENV.ENVIRONMENT == "dev" {
		core = zapcore.NewTee(
			core,
			zapcore.NewCore(zapcore.NewConsoleEncoder(devConfig), zapcore.Lock(os.Stdout), level),
		)
	}

	// Optionally tee logs to Kafka as well.
	if env.ENV.KAFKA_LOG == "Y" {
		kafkaSync := zapcore.AddSync(getKafkaWriter())
		core = zapcore.NewTee(
			core,
			zapcore.NewCore(zapcore.NewJSONEncoder(prodConfig), kafkaSync, level),
		)
	}

	Log = zap.New(core,
		zap.AddCaller(),
		zap.AddCallerSkip(1),
		zap.AddStacktrace(zap.ErrorLevel),
	)
}
```

Expected behavior
No log lines should be skipped after the creation of a new file (after rotation); currently, the first log line after rotation is missing.

Please let me know if I need to change anything in the configuration while initializing the logger.

@r-hang (Contributor) commented Jun 18, 2024

Hey @Akhilesh53,

Would you be able to provide a reproducible example that we can run and debug locally?

@Akhilesh53 (Author)

Hi @r-hang,
I am using the logger configuration shown above. I cannot share the log file because it contains production data. What is happening is that when a log file reaches the threshold size and a new file is created, a few log entries written around that moment are missed.

In other words: some entries for the same request are present in the previous (completed) log file and some are in the newly created file, but the entries in between are missing.

maxLogFileSize = 40
maxLogFileAge = 30
maxLogFiles = 100
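
For a runnable reproduction, here is a minimal self-contained sketch of the same zap + lumberjack setup (the env/getLevel/Kafka pieces are omitted, and the file name, 1 MB MaxSize, and loop count are only illustrative) that forces rotation quickly and tags every entry with a sequence number, so any skipped records show up as gaps across the rotated files:

```go
package main

import (
	"go.uber.org/zap"
	"go.uber.org/zap/zapcore"
	"gopkg.in/natefinch/lumberjack.v2"
)

func main() {
	// Tiny MaxSize (1 MB) so rotation happens after a few thousand entries.
	fileWriter := zapcore.AddSync(&lumberjack.Logger{
		Filename:   "logs/repro.log",
		MaxSize:    1, // megabytes
		MaxBackups: 3,
	})

	cfg := zap.NewProductionEncoderConfig()
	cfg.EncodeTime = zapcore.RFC3339NanoTimeEncoder

	logger := zap.New(zapcore.NewCore(zapcore.NewJSONEncoder(cfg), fileWriter, zapcore.InfoLevel))
	defer logger.Sync()

	// Write numbered entries; afterwards, check logs/repro.log and its
	// rotated backups for missing "seq" values around each rotation point.
	for i := 0; i < 200000; i++ {
		logger.Info("entry", zap.Int("seq", i))
	}
}
```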
