
nacos-go-sdk: pushing a config consumes too much memory, causing OOM #595

Closed
13262233692 opened this issue Mar 22, 2023 · 3 comments · Fixed by #596

Comments

@13262233692

Container memory: 4 GB.
When calling nacos-go-sdk to push a config (the config is under 1 MB),
the program exits and the process is OOM-killed.

@13262233692
Author

Program log.
Before writing the nacos config, ran the command and checked the result:
              total        used        free      shared  buff/cache   available
Mem:        4102900      731512     3006736         476      364652     3174304
Swap:             0           0           0
2023/03/22 08:18:33 nacos_client.go:64: [INFO] logDir:</tmp/nacos/log> cacheDir:</tmp/nacos/cache>
After creating the nacos client, ran the command and checked the result:
              total        used        free      shared  buff/cache   available
Mem:        4102900      731512     3006736         476      364652     3174304
Swap:             0           0           0
About to write to remote nacos, ran the command and checked the result:
              total        used        free      shared  buff/cache   available
Mem:        4102900      731512     3006736         476      364652     3174304
Swap:             0           0           0
Killed
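
The kernel kills the process before the Go runtime can print anything, so the free snapshots alone do not show which allocation blows up. One way to see it, sketched here as an assumption rather than part of the original report, is to expose the standard net/http/pprof endpoint in the reproduction build and grab a heap profile while the push is in flight:

package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* on http.DefaultServeMux
)

func main() {
	// Serve pprof on a side port; while PublishConfig is running, run:
	//   go tool pprof http://localhost:6060/debug/pprof/heap
	// to see live heap allocations grouped by call site.
	go func() {
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	// ... build the nacos client and call PushConfig here ...
	select {} // keep the process alive for profiling in this sketch
}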

Push code:
func (nacosConf *NacosConf) PushConfig(namepaceId, DataId, Group string, content *string) error {
	client, err := nacosConf.GetNacosClient(namepaceId)
	// Check the error before deferring CloseClient: if client creation
	// failed, client may be nil and the deferred call would panic.
	if err != nil {
		log.Error.Printf("nacos new client failed: %v", err)
		return err
	}
	defer client.CloseClient()

	osFree("free", "about to write to remote nacos")
	_, err = client.PublishConfig(vo.ConfigParam{
		DataId:  DataId,
		Group:   Group,
		Content: *content,
	})
	if err != nil {
		fmt.Printf("PublishConfig err: %+v\n", err)
		log.Error.Printf("failed to push nacos config: %v", err)
		return err
	}
	// Only report success after PublishConfig has actually succeeded.
	osFree("free", "remote nacos config written successfully")

	time.Sleep(1 * time.Second)
	log.Info.Printf("nacos config pushed successfully...")
	return nil
}
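
osFree is not defined anywhere in the issue; a hypothetical sketch of what it presumably does (run the named command, here free, and print its output under a descriptive tag) might look like this:

import (
	"fmt"
	"os/exec"
)

// Hypothetical helper, not from the original report: runs the given
// command and prints its combined output with a descriptive tag.
func osFree(cmd, tag string) {
	out, err := exec.Command(cmd).CombinedOutput()
	if err != nil {
		fmt.Printf("%s: running %q failed: %v\n", tag, cmd, err)
		return
	}
	fmt.Printf("%s, command output:\n%s", tag, out)
}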

@13262233692
Author

Rolling the go-sdk back to 2.1.3 works fine.
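
To pin that rollback, assuming the usual module path github.com/nacos-group/nacos-sdk-go/v2 and that a v2.1.3 tag exists as the comment implies, the downgrade would be:

go get github.com/nacos-group/nacos-sdk-go/v2@v2.1.3

which leaves the corresponding line in go.mod as:

require github.com/nacos-group/nacos-sdk-go/v2 v2.1.3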

@brucezo

brucezo commented Mar 30, 2023

2.1.3 also has plenty of pitfalls.
