
[Bug][Sonarcube] Data too long for column 'component' at row 12 #7852

Closed · 2 of 3 tasks
Capone1983 opened this issue Aug 6, 2024 · 1 comment · Fixed by #7885
Labels
pr-type/bug-fix (This PR fixes a bug), type/bug (This issue is a bug)

Comments

@Capone1983

Search before asking

  • I had searched in the issues and found no similar issues.

What happened

A SonarQube scan fails the pipeline run for a project.

attached stack trace
-- stack trace:
github.com/apache/incubator-devlake/server/services.runPipeline
    /app/server/services/pipeline_runner.go:79
[...repeated from below...]
Wraps: (2) Error running pipeline 14.
Wraps: (3) attached stack trace
-- stack trace:
github.com/apache/incubator-devlake/server/services.RunTasksStandalone
    /app/server/services/task.go:217
github.com/apache/incubator-devlake/server/services.(*pipelineRunner).runPipelineStandalone.func1
    /app/server/services/pipeline_runner.go:42
github.com/apache/incubator-devlake/core/runner.runPipelineTasks
    /app/core/runner/run_pipeline.go:90
github.com/apache/incubator-devlake/core/runner.RunPipeline
    /app/core/runner/run_pipeline.go:54
github.com/apache/incubator-devlake/server/services.(*pipelineRunner).runPipelineStandalone
    /app/server/services/pipeline_runner.go:38
github.com/apache/incubator-devlake/server/services.runPipeline
    /app/server/services/pipeline_runner.go:76
github.com/apache/incubator-devlake/server/services.RunPipelineInQueue.func1
    /app/server/services/pipeline.go:347
runtime.goexit
    /usr/local/go/src/runtime/asm_amd64.s:1598
Wraps: (4) attached stack trace
  -- stack trace:
  github.com/apache/incubator-devlake/server/services.RunTasksStandalone.func1
      /app/server/services/task.go:189
  Wraps: (2) Error running task 870.
  Wraps: (3) attached stack trace
  -- stack trace:
    github.com/apache/incubator-devlake/core/runner.RunPluginSubTasks
        /app/core/runner/run_task.go:329
    [...repeated from below...]
  Wraps: (4) subtask convertIssueCodeBlocks ended unexpectedly
  Wraps: (5) attached stack trace
  -- stack trace:
    github.com/apache/incubator-devlake/impls/dalgorm.(*Dalgorm).convertGormError
        /app/impls/dalgorm/dalgorm.go:523
    github.com/apache/incubator-devlake/impls/dalgorm.(*Dalgorm).CreateOrUpdate
        /app/impls/dalgorm/dalgorm.go:265
    github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*BatchSave).flushWithoutLocking
        /app/helpers/pluginhelper/api/batch_save.go:131
    github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*BatchSave).Close
        /app/helpers/pluginhelper/api/batch_save.go:147
    github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*BatchSaveDivider).Close
        /app/helpers/pluginhelper/api/batch_save_divider.go:102
    github.com/apache/incubator-devlake/helpers/pluginhelper/api.(*DataConverter).Execute
        /app/helpers/pluginhelper/api/data_convertor.go:131
    github.com/apache/incubator-devlake/plugins/sonarqube/tasks.ConvertIssueCodeBlocks
        /app/plugins/sonarqube/tasks/issue_code_blocks_convertor.go:77
    github.com/apache/incubator-devlake/core/runner.runSubtask
        /app/core/runner/run_task.go:405
    github.com/apache/incubator-devlake/core/runner.RunPluginSubTasks
        /app/core/runner/run_task.go:327
    github.com/apache/incubator-devlake/core/runner.RunPluginTask
        /app/core/runner/run_task.go:163
    github.com/apache/incubator-devlake/core/runner.RunTask
        /app/core/runner/run_task.go:137
    github.com/apache/incubator-devlake/server/services.runTaskStandalone
        /app/server/services/task_runner.go:113
    github.com/apache/incubator-devlake/server/services.RunTasksStandalone.func1
        /app/server/services/task.go:187
    runtime.goexit
        /usr/local/go/src/runtime/asm_amd64.s:1598
  Wraps: (6) Error 1406 (22001): Data too long for column 'component' at row 12 (500)
  Wraps: (7) Error 1406 (22001): Data too long for column 'component' at row 12
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.withPrefix (7) *mysql.MySQLError
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.leafError

What do you expect to happen

The project is scanned in SonarQube and the pipeline completes successfully.

How to reproduce

Install v1.0.1-beta4@483c93e.
Add a SonarQube connection and collect a project with a large Java folder structure.

Anything else

No response

Version

v1.0.1-beta4@483c93e

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

Capone1983 added the type/bug (This issue is a bug) label on Aug 6, 2024
dosubot (bot) added the pr-type/bug-fix (This PR fixes a bug) label on Aug 6, 2024

dosubot bot commented Aug 6, 2024

To resolve the "Data too long for column 'component' at row 12" error encountered during a SonarQube scan in the pipeline, increase the length of the component column in the affected tables. The migration script 20231007_modify_component_length.go shows how to change the column type to varchar(500).

Here is the relevant code snippet from the migration script:

package migrationscripts

import (
	"github.com/apache/incubator-devlake/core/context"
	"github.com/apache/incubator-devlake/core/dal"
	"github.com/apache/incubator-devlake/core/errors"
	"github.com/apache/incubator-devlake/core/plugin"
	"github.com/apache/incubator-devlake/helpers/migrationhelper"
)

var _ plugin.MigrationScript = (*modifyComponentLength)(nil)

type modifyComponentLength struct{}

type sonarqubeHotspot20231007 struct {
	Component string `gorm:"index;type:varchar(500)"`
}

func (sonarqubeHotspot20231007) TableName() string {
	return "_tool_sonarqube_hotspots"
}

type sonarqubeIssueCodeBlock20231007 struct {
	Component string `gorm:"index;type:varchar(500)"`
}

func (sonarqubeIssueCodeBlock20231007) TableName() string {
	return "_tool_sonarqube_issue_code_blocks"
}

type sonarqubeIssue20231007 struct {
	Component string `gorm:"index;type:varchar(500)"`
}

func (sonarqubeIssue20231007) TableName() string {
	return "_tool_sonarqube_issues"
}

// Up widens the component column to varchar(500) on each affected table,
// copying existing values through a temporary column.
func (script *modifyComponentLength) Up(basicRes context.BasicRes) errors.Error {
	db := basicRes.GetDal()
	err := migrationhelper.ChangeColumnsType[sonarqubeHotspot20231007](
		basicRes,
		script,
		sonarqubeHotspot20231007{}.TableName(),
		[]string{"component"},
		func(tmpColumnParams []interface{}) errors.Error {
			return db.UpdateColumn(
				&sonarqubeHotspot20231007{},
				"component",
				dal.DalClause{Expr: " ? ", Params: tmpColumnParams},
				dal.Where("? != '' ", tmpColumnParams...),
			)
		},
	)
	if err != nil {
		return err
	}

	err = migrationhelper.ChangeColumnsType[sonarqubeIssueCodeBlock20231007](
		basicRes,
		script,
		sonarqubeIssueCodeBlock20231007{}.TableName(),
		[]string{"component"},
		func(tmpColumnParams []interface{}) errors.Error {
			return db.UpdateColumn(
				&sonarqubeIssueCodeBlock20231007{},
				"component",
				dal.DalClause{Expr: " ? ", Params: tmpColumnParams},
				dal.Where("? != '' ", tmpColumnParams...),
			)
		},
	)
	if err != nil {
		return err
	}

	err = migrationhelper.ChangeColumnsType[sonarqubeIssue20231007](
		basicRes,
		script,
		sonarqubeIssue20231007{}.TableName(),
		[]string{"component"},
		func(tmpColumnParams []interface{}) errors.Error {
			return db.UpdateColumn(
				&sonarqubeIssue20231007{},
				"component",
				dal.DalClause{Expr: " ? ", Params: tmpColumnParams},
				dal.Where("? != '' ", tmpColumnParams...),
			)
		},
	)
	if err != nil {
		return err
	}

	return nil
}

func (*modifyComponentLength) Version() uint64 {
	return 20231007145127
}

func (*modifyComponentLength) Name() string {
	return "modify component type to varchar(500)"
}

This script modifies the component column type to varchar(500) for the following tables:

  • _tool_sonarqube_hotspots
  • _tool_sonarqube_issue_code_blocks
  • _tool_sonarqube_issues

To apply this migration, ensure the script runs as part of your database migration process; that should resolve the "Data too long for column 'component' at row 12" error.
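Independently of widening the columns, a collector can defend against over-long values before inserting. The helper below is a hypothetical sketch (not the approach taken by the merged fix): it clamps by runes rather than bytes, matching MySQL's character-based varchar length semantics, so a multi-byte character is never split.

```go
package main

import "fmt"

// truncateForColumn clamps s to at most max characters so an INSERT into a
// varchar(max) column cannot fail with MySQL error 1406. Hypothetical
// mitigation for illustration only; the merged fix widened the columns
// instead of truncating data.
func truncateForColumn(s string, max int) string {
	r := []rune(s) // count characters, not bytes, like MySQL varchar does
	if len(r) <= max {
		return s
	}
	return string(r[:max])
}

func main() {
	long := "src/main/java/com/example/deeply/nested/pkg/Foo.java"
	fmt.Println(truncateForColumn(long, 20)) // first 20 characters of the path
}
```

Truncation loses data, so widening the column (as the migration above does) is the better fix; a clamp like this is only a stopgap when the schema cannot be changed.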


d4x1 closed this as completed in 7e97e43 on Aug 14, 2024
github-actions bot pushed a commit that referenced this issue Aug 14, 2024
* refactor: remove extraction logic from github graphql collector

* fix: pull requests not being updated

* refactor: update github job extractor for single record structure

* refactor: update github issue extractor to extract single record

* refactor: update github deployment extractor to extract single records

* refactor: update github account/release extractor to adopt single record extraction

* docs: add comments

* fix: github issues not being updated

* fix: gitub deployment unit test

* fix: change jira issue component field to text type

* fix: column type should not be parameterized

* fix: #7852 sonarqube issues component field data too long

* fix: change issue.components to text to avoid data too long error

* docs: fix jira issues component tag

* fix: change bitbucket issue components field to text

* fix: change gitee issue components field to text

* fix: change github issue components field to text

* fix: change gitlab issue components field to text

* fix: #7715 Data too long for column 'project_key'

* fix(framework): update trigger api's request body schema (#7888)

* fix(framework): update trigger api's request body schema

* fix(framework): fix test errors

* fix(framework): fix test errors

* test(jira): add e2e test for custom account field in issues (#7894)

* fix: github graphql collectors are not refetching data in incremental mode (#7878)

* refactor: remove extraction logic from github graphql collector

* fix: pull requests not being updated

* refactor: update github job extractor for single record structure

* refactor: update github issue extractor to extract single record

* refactor: update github deployment extractor to extract single records

* refactor: update github account/release extractor to adopt single record extraction

* docs: add comments

* fix: github issues not being updated

* fix: gitub deployment unit test

* fix: github graphql issue collector should order records by CREATED_AT to avoid data missing

* fix: linting

---------

Co-authored-by: Lynwee <1507509064@qq.com>
d4x1 added a commit that referenced this issue Aug 14, 2024
…) (#7898)


Co-authored-by: Klesh Wong <zhenmian.huang@merico.dev>
Co-authored-by: Lynwee <1507509064@qq.com>