Commit d26644db authored by GitLab Bot

Automatic merge of gitlab-org/gitlab master

parents 0011883a 28cb67da
---
title: Add rake task to cleanup description templates cache in batches
merge_request: 54706
author:
type: added
@@ -13,13 +13,13 @@ GitLab provides Rake tasks for general maintenance.
This command gathers information about your GitLab installation and the system it runs on.
These may be useful when asking for help or reporting issues.

**For Omnibus installations**

```shell
sudo gitlab-rake gitlab:env:info
```

**For installations from source**

```shell
bundle exec rake gitlab:env:info RAILS_ENV=production
```

@@ -76,13 +76,13 @@ installations: a license cannot be installed into GitLab Community Edition.
These may be useful when raising tickets with Support, or for programmatically
checking your license parameters.

**For Omnibus installations**

```shell
sudo gitlab-rake gitlab:license:info
```

**For installations from source**

```shell
bundle exec rake gitlab:license:info RAILS_ENV=production
```

@@ -119,13 +119,13 @@ You may also have a look at our troubleshooting guides for:
To run `gitlab:check`, run:

**For Omnibus installations**

```shell
sudo gitlab-rake gitlab:check
```

**For installations from source**

```shell
bundle exec rake gitlab:check RAILS_ENV=production
```

@@ -182,13 +182,13 @@ Checking GitLab ... Finished
In some cases it is necessary to rebuild the `authorized_keys` file. To do this, run:

**For Omnibus installations**

```shell
sudo gitlab-rake gitlab:shell:setup
```

**For installations from source**

```shell
cd /home/git/gitlab
```

@@ -203,18 +203,64 @@ You will lose any data stored in authorized_keys file.

```shell
Do you want to continue (yes/no)? yes
```
## Clear issue and merge request description template names cache
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/54706) in GitLab 13.10.
If the issue or merge request description template names in the dropdown
do not reflect the actual description template names in the repository, consider clearing
the Redis cache that stores the template names information.
You can clear the cache of
[all issues and merge request templates in the installation](#clear-cache-for-all-issue-and-merge-request-template-names)
or [in a specific project](#clear-cache-for-issue-and-merge-request-template-names-in-specific-projects).
### Clear cache for all issue and merge request template names
If you want to refresh issue and merge request templates for all projects:
**For Omnibus installations**
```shell
sudo gitlab-rake cache:clear:description_templates
```
**For installations from source**
```shell
cd /home/git/gitlab
sudo -u git -H bundle exec rake cache:clear:description_templates RAILS_ENV=production
```
### Clear cache for issue and merge request template names in specific projects
If you want to refresh issue and merge request templates for specific projects,
provide a comma-separated list of IDs as the `project_ids` parameter to the Rake task.
**For Omnibus installations**
```shell
sudo gitlab-rake cache:clear:description_templates project_ids=10,25,35
```
**For installations from source**
```shell
cd /home/git/gitlab
sudo -u git -H bundle exec rake cache:clear:description_templates project_ids=10,25,35 RAILS_ENV=production
```
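
For reference, the Rake task introduced by this merge request is a thin wrapper around two cleanup classes shown later on this page: it builds Redis key patterns with `DescriptionTemplatesCacheKeysPatternBuilder` and deletes the matching keys in batches with `BatchDeleteByPattern`. A rough Rails console equivalent, for illustration only (the project IDs are illustrative and the Rake task remains the supported interface):

```ruby
# Roughly what `rake cache:clear:description_templates project_ids=10,25` does internally.
patterns = ::Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder.new(%w[10 25]).execute
::Gitlab::Cleanup::Redis::BatchDeleteByPattern.new(patterns).execute
```
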
## Clear Redis cache

If for some reason the dashboard displays the wrong information, you might want to
clear Redis' cache. To do this, run:

**For Omnibus installations**

```shell
sudo gitlab-rake cache:clear
```

**For installations from source**

```shell
cd /home/git/gitlab
```

@@ -229,7 +275,7 @@ missing some icons. In that case, try to precompile the assets again.
This only applies to source installations and does NOT apply to
Omnibus packages.

**For installations from source**

```shell
cd /home/git/gitlab
```

@@ -249,13 +295,13 @@ Sometimes you need to know if your GitLab installation can connect to a TCP
service on another machine - perhaps a PostgreSQL or HTTPS server. A Rake task
is included to help you with this:

**For Omnibus installations**

```shell
sudo gitlab-rake gitlab:tcp_check[example.com,80]
```

**For installations from source**

```shell
cd /home/git/gitlab
```
......
@@ -30,10 +30,10 @@ Metrics for a branch are read from the latest metrics report artifact (default f
For an MR, the values of these metrics from the feature branch are compared to the values from the target branch. Then they are displayed in the MR widget in this order:

- Existing metrics with changed values.
- Metrics that have been added by the MR. Marked with a **New** badge.
- Metrics that have been removed by the MR. Marked with a **Removed** badge.
- Existing metrics with unchanged values.

## How to set it up
......
@@ -13,7 +13,8 @@ export const summaryStatus = (state) => {
};

export const metrics = (state) => [
  ...state.changedMetrics,
  ...state.newMetrics.map((metric) => ({ ...metric, isNew: true })),
  ...state.removedMetrics.map((metric) => ({ ...metric, wasRemoved: true })),
  ...state.unchangedMetrics,
];
@@ -12,25 +12,24 @@ export default {
    state.hasError = false;
    state.isLoading = false;

    state.changedMetrics =
      response.existing_metrics?.filter((metric) => metric?.previous_value) || [];
    state.newMetrics = response.new_metrics || [];
    state.removedMetrics = response.removed_metrics || [];
    state.unchangedMetrics =
      response.existing_metrics?.filter((metric) => !metric?.previous_value) || [];

    state.numberOfChanges =
      state.changedMetrics.length + state.newMetrics.length + state.removedMetrics.length;
  },
  [types.RECEIVE_METRICS_ERROR](state) {
    state.isLoading = false;
    state.hasError = true;
    state.changedMetrics = [];
    state.newMetrics = [];
    state.removedMetrics = [];
    state.unchangedMetrics = [];
    state.numberOfChanges = 0;
  },
......
@@ -12,9 +12,10 @@ export default () => ({
   * previous_value: {String}
   * }
   */
  changedMetrics: [],
  newMetrics: [],
  removedMetrics: [],
  unchangedMetrics: [],
  numberOfChanges: 0,
});
@@ -15,6 +15,16 @@ module Integrations
        expose :due_date do |jira_issue|
          jira_issue.duedate&.to_datetime&.utc
        end

        expose :comments do |jira_issue|
          jira_issue.renderedFields['comment']['comments'].map do |comment|
            jira_user(comment['author']).merge(
              note: Banzai::Pipeline::JiraGfmPipeline.call(comment['body'], project: project)[:output].to_html,
              created_at: comment['created'].to_datetime.utc,
              updated_at: comment['updated'].to_datetime.utc
            )
          end
        end
      end
    end
  end
@@ -41,13 +41,13 @@ module Integrations
      end

      expose :author do |jira_issue|
        jira_user(jira_issue.fields['reporter'])
      end

      expose :assignees do |jira_issue|
        if jira_issue.fields['assignee']
          [
            jira_user(jira_issue.fields['assignee'])
          ]
        else
          []

@@ -61,8 +61,6 @@ module Integrations
      expose :gitlab_web_url do |jira_issue|
        if ::Feature.enabled?(:jira_issues_show_integration, project, default_enabled: :yaml)
          project_integrations_jira_issue_path(project, jira_issue.key)
        end
      end

@@ -78,26 +76,24 @@ module Integrations
      private

      def jira_user(user)
        {
          name: user['displayName'],
          web_url: jira_web_url(user),
          avatar_url: user['avatarUrls']['48x48']
        }
      end

      def jira_web_url(user)
        # There are differences between Jira Cloud and Jira Server URLs and responses.
        # accountId is only available on Jira Cloud.
        # https://community.atlassian.com/t5/Jira-Questions/How-to-find-account-id-on-jira-on-premise/qaq-p/1168652
        if user['accountId'].present?
          "#{base_web_url}/people/#{user['accountId']}"
        else
          "#{base_web_url}/secure/ViewProfile.jspa?name=#{user['name']}"
        end
      end

      def base_web_url
        @base_web_url ||= project.jira_service.url
......
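
To make the Cloud/Server distinction above concrete, here is a stand-alone sketch of the URL selection performed by `jira_web_url` (not the entity code itself; the user hashes and base URL mirror the spec fixtures below):

```ruby
require 'active_support/core_ext/object/blank'

# Simplified, hypothetical version of the helper for illustration only.
def jira_web_url(user, base_web_url)
  if user['accountId'].present? # only Jira Cloud returns accountId
    "#{base_web_url}/people/#{user['accountId']}"
  else # Jira Server: fall back to the username-based profile URL
    "#{base_web_url}/secure/ViewProfile.jspa?name=#{user['name']}"
  end
end

jira_web_url({ 'accountId' => '12345' }, 'http://jira.com')
# => "http://jira.com/people/12345"
jira_web_url({ 'name' => 'reporter@reporter.com' }, 'http://jira.com')
# => "http://jira.com/secure/ViewProfile.jspa?name=reporter@reporter.com"
```
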
---
title: Sort metrics report MR widget - changed, new, removed, unchanged
merge_request: 55217
author:
type: changed
@@ -73,7 +73,7 @@ describe('Grouped metrics reports app', () => {
  describe('when user expands to view metrics', () => {
    beforeEach(() => {
      mockStore.state.numberOfChanges = 0;
      mockStore.state.unchangedMetrics = [
        {
          name: 'name',
          value: 'value',

@@ -110,7 +110,7 @@ describe('Grouped metrics reports app', () => {
  describe('with no changes', () => {
    beforeEach(() => {
      mockStore.state.numberOfChanges = 0;
      mockStore.state.unchangedMetrics = [
        {
          name: 'name',
          value: 'value',

@@ -129,7 +129,7 @@ describe('Grouped metrics reports app', () => {
  describe('with one change', () => {
    beforeEach(() => {
      mockStore.state.numberOfChanges = 1;
      mockStore.state.changedMetrics = [
        {
          name: 'name',
          value: 'value',

@@ -149,7 +149,7 @@ describe('Grouped metrics reports app', () => {
  describe('with multiple changes', () => {
    beforeEach(() => {
      mockStore.state.numberOfChanges = 2;
      mockStore.state.changedMetrics = [
        {
          name: 'name',
          value: 'value',

@@ -212,7 +212,7 @@ describe('Grouped metrics reports app', () => {
  describe('when has metrics', () => {
    beforeEach(() => {
      mockStore.state.numberOfChanges = 1;
      mockStore.state.changedMetrics = [
        {
          name: 'name',
          value: 'value',
......
@@ -56,10 +56,10 @@ describe('metrics reports getters', () => {
    });
  });

  describe('when state has changed metrics', () => {
    it('returns array with changed metrics', () => {
      const mockState = state();
      mockState.changedMetrics = [{ name: 'name', value: 'value', previous_value: 'prev' }];

      const metricsResult = metrics(mockState);

      expect(metricsResult.length).toEqual(1);

@@ -69,6 +69,18 @@ describe('metrics reports getters', () => {
    });
  });

  describe('when state has unchanged metrics', () => {
    it('returns array with unchanged metrics', () => {
      const mockState = state();
      mockState.unchangedMetrics = [{ name: 'name', value: 'value' }];

      const metricsResult = metrics(mockState);

      expect(metricsResult.length).toEqual(1);
      expect(metricsResult[0].name).toEqual('name');
      expect(metricsResult[0].value).toEqual('value');
    });
  });

  describe('when state has removed metrics', () => {
    it('returns array with removed metrics', () => {
      const mockState = state();

@@ -82,23 +94,31 @@ describe('metrics reports getters', () => {
    });
  });

  describe('when state has new, changed, unchanged, and removed metrics', () => {
    it('returns array with changed, new, removed, and unchanged metrics combined', () => {
      const mockState = state();
      mockState.changedMetrics = [{ name: 'name1', value: 'value1', previous_value: 'prev' }];
      mockState.newMetrics = [{ name: 'name2', value: 'value2' }];
      mockState.removedMetrics = [{ name: 'name3', value: 'value3' }];
      mockState.unchangedMetrics = [{ name: 'name4', value: 'value4' }];

      const metricsResult = metrics(mockState);

      expect(metricsResult.length).toEqual(4);
      expect(metricsResult[0].name).toEqual('name1');
      expect(metricsResult[0].value).toEqual('value1');
      expect(metricsResult[0].previous_value).toEqual('prev');
      expect(metricsResult[1].name).toEqual('name2');
      expect(metricsResult[1].value).toEqual('value2');
      expect(metricsResult[1].isNew).toEqual(true);
      expect(metricsResult[2].name).toEqual('name3');
      expect(metricsResult[2].value).toEqual('value3');
      expect(metricsResult[2].wasRemoved).toEqual(true);
      expect(metricsResult[3].name).toEqual('name4');
      expect(metricsResult[3].value).toEqual('value4');
    });
  });
......
@@ -37,7 +37,7 @@ describe('metrics reports mutations', () => {
      };

      mutations[types.RECEIVE_METRICS_SUCCESS](mockState, data);

      expect(mockState.unchangedMetrics).toEqual(data.existing_metrics);
      expect(mockState.numberOfChanges).toEqual(0);
      expect(mockState.isLoading).toEqual(false);
    });

@@ -70,38 +70,10 @@ describe('metrics reports mutations', () => {
      };

      mutations[types.RECEIVE_METRICS_SUCCESS](mockState, data);

      expect(mockState.changedMetrics).toEqual(data.existing_metrics);
      expect(mockState.numberOfChanges).toEqual(1);
      expect(mockState.isLoading).toEqual(false);
    });
  });

  describe('RECEIVE_METRICS_ERROR', () => {
......
@@ -9,32 +9,54 @@ RSpec.describe Integrations::Jira::IssueDetailEntity do
  let_it_be(:jira_service) { create(:jira_service, project: project, url: 'http://jira.com', api_url: 'http://api.jira.com') }

  let(:reporter) do
    {
      'displayName' => 'reporter',
      'avatarUrls' => { '48x48' => 'http://reporter.avatar' },
      'name' => double
    }
  end

  let(:assignee) do
    {
      'displayName' => 'assignee',
      'avatarUrls' => { '48x48' => 'http://assignee.avatar' },
      'name' => double
    }
  end

  let(:comment_author) do
    {
      'displayName' => 'comment_author',
      'avatarUrls' => { '48x48' => 'http://comment_author.avatar' },
      'name' => double
    }
  end

  let(:jira_issue) do
    double(
      summary: 'Title',
      renderedFields: {
        'description' => '<p>Description</p>',
        'comment' => {
          'comments' => [
            {
              'author' => comment_author,
              'body' => '<p>Comment</p>',
              'created' => '2020-06-25T15:50:00.000+0000',
              'updated' => '2020-06-25T15:51:00.000+0000'
            }
          ]
        }
      },
      created: '2020-06-25T15:39:30.000+0000',
      updated: '2020-06-26T15:38:32.000+0000',
      duedate: '2020-06-27T15:40:30.000+0000',
      resolutiondate: '2020-06-27T13:23:51.000+0000',
      labels: ['backend'],
      fields: {
        'reporter' => reporter,
        'assignee' => assignee
      },
      project: double(key: 'GL'),
      key: 'GL-5',
      status: double(name: 'To Do')

@@ -74,19 +96,30 @@ RSpec.describe Integrations::Jira::IssueDetailEntity do
      ],
      web_url: 'http://jira.com/browse/GL-5',
      references: { relative: 'GL-5' },
      external_tracker: 'jira',
      comments: [
        hash_including(
          name: 'comment_author',
          avatar_url: 'http://comment_author.avatar',
          note: "<p dir=\"auto\">Comment</p>",
          created_at: '2020-06-25T15:50:00.000+0000'.to_datetime.utc,
          updated_at: '2020-06-25T15:51:00.000+0000'.to_datetime.utc
        )
      ]
    )
  end

  context 'with Jira Server configuration' do
    before do
      reporter['name'] = 'reporter@reporter.com'
      assignee['name'] = 'assignee@assignee.com'
      comment_author['name'] = 'comment@author.com'
    end

    it 'returns the Jira Server profile URL' do
      expect(subject[:author]).to include(web_url: 'http://jira.com/secure/ViewProfile.jspa?name=reporter@reporter.com')
      expect(subject[:assignees].first).to include(web_url: 'http://jira.com/secure/ViewProfile.jspa?name=assignee@assignee.com')
      expect(subject[:comments].first).to include(web_url: 'http://jira.com/secure/ViewProfile.jspa?name=comment@author.com')
    end

    context 'with only url' do

@@ -104,20 +137,20 @@ RSpec.describe Integrations::Jira::IssueDetailEntity do
  context 'with Jira Cloud configuration' do
    before do
      reporter['accountId'] = '12345'
      assignee['accountId'] = '67890'
      comment_author['accountId'] = '54321'
    end

    it 'returns the Jira Cloud profile URL' do
      expect(subject[:author]).to include(web_url: 'http://jira.com/people/12345')
      expect(subject[:assignees].first).to include(web_url: 'http://jira.com/people/67890')
      expect(subject[:comments].first).to include(web_url: 'http://jira.com/people/54321')
    end
  end

  context 'without assignee' do
    let(:assignee) { nil }

    it 'returns an empty array' do
      expect(subject).to include(assignees: [])
......
@@ -9,19 +9,19 @@ RSpec.describe Integrations::Jira::IssueEntity do
  let_it_be(:jira_service) { create(:jira_service, project: project, url: 'http://jira.com', api_url: 'http://api.jira.com') }

  let(:reporter) do
    {
      'displayName' => 'reporter',
      'avatarUrls' => { '48x48' => 'http://reporter.avatar' },
      'name' => double
    }
  end

  let(:assignee) do
    {
      'displayName' => 'assignee',
      'avatarUrls' => { '48x48' => 'http://assignee.avatar' },
      'name' => double
    }
  end

  let(:jira_issue) do

@@ -31,8 +31,10 @@ RSpec.describe Integrations::Jira::IssueEntity do
      updated: '2020-06-26T15:38:32.000+0000',
      resolutiondate: '2020-06-27T13:23:51.000+0000',
      labels: ['backend'],
      fields: {
        'reporter' => reporter,
        'assignee' => assignee
      },
      project: double(key: 'GL'),
      key: 'GL-5',
      status: double(name: 'To Do')

@@ -76,8 +78,8 @@ RSpec.describe Integrations::Jira::IssueEntity do
  context 'with Jira Server configuration' do
    before do
      reporter['name'] = 'reporter@reporter.com'
      assignee['name'] = 'assignee@assignee.com'
    end

    it 'returns the Jira Server profile URL' do

@@ -100,8 +102,8 @@ RSpec.describe Integrations::Jira::IssueEntity do
  context 'with Jira Cloud configuration' do
    before do
      reporter['accountId'] = '12345'
      assignee['accountId'] = '67890'
    end

    it 'returns the Jira Cloud profile URL' do

@@ -111,9 +113,7 @@ RSpec.describe Integrations::Jira::IssueEntity do
  end

  context 'without assignee' do
    let(:assignee) { nil }

    it 'returns an empty array' do
      expect(subject).to include(assignees: [])
......
@@ -9,6 +9,8 @@ module Gitlab
      class ProjectPipelineStatus
        include Gitlab::Utils::StrongMemoize

        ALL_PIPELINES_STATUS_PATTERN = 'projects/*/pipeline_status'

        attr_accessor :sha, :status, :ref, :project, :loaded

        def self.load_for_project(project)
......
@@ -52,7 +52,7 @@ module Gitlab
          return unless included?

          strong_memoize(:errors) do
            [needs_errors, variable_expansion_errors].compact.flatten
          end
        end

@@ -153,6 +153,12 @@ module Gitlab
          @pipeline.project.actual_limits.ci_needs_size_limit
        end

        def variable_expansion_errors
          sorted_collection = evaluate_context.variables.sorted_collection(@pipeline.project)
          errors = sorted_collection.errors
          ["#{name}: #{errors}"] if errors
        end

        def pipeline_attributes
          {
            pipeline: @pipeline,
......
# frozen_string_literal: true

module Gitlab
  module Cleanup
    module Redis
      class BatchDeleteByPattern
        REDIS_CLEAR_BATCH_SIZE = 1000 # There seems to be no speedup when pushing beyond 1,000
        REDIS_SCAN_START_STOP = '0'.freeze # Magic value, see http://redis.io/commands/scan

        attr_reader :patterns

        def initialize(patterns)
          raise ArgumentError.new('Argument should be an Array of patterns') unless patterns.is_a?(Array)

          @patterns = patterns
        end

        def execute
          return if patterns.blank?

          batch_delete_cache_keys
        end

        private

        def batch_delete_cache_keys
          Gitlab::Redis::Cache.with do |redis|
            patterns.each do |match|
              cursor = REDIS_SCAN_START_STOP
              loop do
                cursor, keys = redis.scan(
                  cursor,
                  match: match,
                  count: REDIS_CLEAR_BATCH_SIZE
                )

                Gitlab::Instrumentation::RedisClusterValidator.allow_cross_slot_commands do
                  redis.del(*keys) if keys.any?
                end

                break if cursor == REDIS_SCAN_START_STOP
              end
            end
          end
        end
      end
    end
  end
end
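
For orientation, a minimal usage sketch of this class. The key pattern below is illustrative (it mirrors the description templates key format used elsewhere in this merge request); any glob accepted by Redis `SCAN` works:

```ruby
# Hypothetical pattern; SCANs matching keys in batches of REDIS_CLEAR_BATCH_SIZE and deletes them.
patterns = ["#{Gitlab::Redis::Cache::CACHE_NAMESPACE}:issue_template_names_hash:*"]
Gitlab::Cleanup::Redis::BatchDeleteByPattern.new(patterns).execute
```
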
# frozen_string_literal: true

module Gitlab
  module Cleanup
    module Redis
      class DescriptionTemplatesCacheKeysPatternBuilder
        # project_ids - a list of project_ids for which to compute description templates cache keys or `:all` to compute
        # a pattern that covers all description templates cache keys.
        #
        # Example
        # * ::Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder.new(:all).execute - to get 2
        # patterns for all issue and merge request description templates cache keys.
        #
        # * ::Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder.new([1,2,3,4]).execute - to get an array of
        # patterns for each project's issue and merge request description templates cache keys.
        def initialize(project_ids)
          raise ArgumentError.new('project_ids can either be an array of project IDs or :all') if project_ids != :all && !project_ids.is_a?(Array)

          @project_ids = parse_project_ids(project_ids)
        end

        def execute
          case project_ids
          when :all
            all_instance_patterns
          else
            project_patterns
          end
        end

        private

        attr_reader :project_ids

        def parse_project_ids(project_ids)
          return project_ids if project_ids == :all

          project_ids.map { |id| Integer(id) }
        rescue ArgumentError
          raise ArgumentError.new('Invalid Project ID. Please ensure all passed in project ids values are valid integer project ids.')
        end

        def project_patterns
          cache_key_patterns = []
          Project.id_in(project_ids).each_batch do |batch|
            cache_key_patterns << batch.map do |pr|
              next unless pr.repository.exists?

              cache = Gitlab::RepositoryCache.new(pr.repository)
              [repo_issue_templates_cache_key(cache), repo_merge_request_templates_cache_key(cache)]
            end
          end

          cache_key_patterns.flatten.compact
        end

        def all_instance_patterns
          [all_issue_templates_cache_key, all_merge_request_templates_cache_key]
        end

        def issue_templates_cache_key
          Repository::METHOD_CACHES_FOR_FILE_TYPES[:issue_template]
        end

        def merge_request_templates_cache_key
          Repository::METHOD_CACHES_FOR_FILE_TYPES[:merge_request_template]
        end

        def all_issue_templates_cache_key
          "#{Gitlab::Redis::Cache::CACHE_NAMESPACE}:#{issue_templates_cache_key}:*"
        end

        def all_merge_request_templates_cache_key
          "#{Gitlab::Redis::Cache::CACHE_NAMESPACE}:#{merge_request_templates_cache_key}:*"
        end

        def repo_issue_templates_cache_key(cache)
          "#{Gitlab::Redis::Cache::CACHE_NAMESPACE}:#{cache.cache_key(issue_templates_cache_key)}"
        end

        def repo_merge_request_templates_cache_key(cache)
          "#{Gitlab::Redis::Cache::CACHE_NAMESPACE}:#{cache.cache_key(merge_request_templates_cache_key)}"
        end
      end
    end
  end
end
@@ -2,32 +2,22 @@
namespace :cache do
  namespace :clear do
    desc "GitLab | Cache | Clear redis cache"
    task redis: :environment do
      cache_key_patterns = %W[
        #{Gitlab::Redis::Cache::CACHE_NAMESPACE}*
        #{Gitlab::Cache::Ci::ProjectPipelineStatus::ALL_PIPELINES_STATUS_PATTERN}
      ]

      ::Gitlab::Cleanup::Redis::BatchDeleteByPattern.new(cache_key_patterns).execute
    end

    desc "GitLab | Cache | Clear description templates redis cache"
    task description_templates: :environment do
      project_ids = Array(ENV['project_ids']&.split(',')).map!(&:squish)

      cache_key_patterns = ::Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder.new(project_ids).execute
      ::Gitlab::Cleanup::Redis::BatchDeleteByPattern.new(cache_key_patterns).execute
    end

    task all: [:redis]
......
@@ -1025,4 +1025,75 @@ RSpec.describe Gitlab::Ci::Pipeline::Seed::Build do
      end
    end
  end

  describe 'applying pipeline variables' do
    subject { seed_build }

    let(:pipeline_variables) { [] }
    let(:pipeline) do
      build(:ci_empty_pipeline, project: project, sha: head_sha, variables: pipeline_variables)
    end

    context 'containing variable references' do
      let(:pipeline_variables) do
        [
          build(:ci_pipeline_variable, key: 'A', value: '$B'),
          build(:ci_pipeline_variable, key: 'B', value: '$C')
        ]
      end

      context 'when FF :variable_inside_variable is enabled' do
        before do
          stub_feature_flags(variable_inside_variable: [project])
        end

        it "does not have errors" do
          expect(subject.errors).to be_empty
        end
      end
    end

    context 'containing cyclic reference' do
      let(:pipeline_variables) do
        [
          build(:ci_pipeline_variable, key: 'A', value: '$B'),
          build(:ci_pipeline_variable, key: 'B', value: '$C'),
          build(:ci_pipeline_variable, key: 'C', value: '$A')
        ]
      end

      context 'when FF :variable_inside_variable is disabled' do
        before do
          stub_feature_flags(variable_inside_variable: false)
        end

        it "does not have errors" do
          expect(subject.errors).to be_empty
        end
      end

      context 'when FF :variable_inside_variable is enabled' do
        before do
          stub_feature_flags(variable_inside_variable: [project])
        end

        it "returns an error" do
          expect(subject.errors).to contain_exactly(
            'rspec: circular variable reference detected: ["A", "B", "C"]')
        end

        context 'with job:rules:[if:]' do
          let(:attributes) { { name: 'rspec', ref: 'master', rules: [{ if: '$C != null', when: 'always' }] } }

          it "included? does not raise" do
            expect { subject.included? }.not_to raise_error
          end

          it "included? returns true" do
            expect(subject.included?).to eq(true)
          end
        end
      end
    end
  end
end
# frozen_string_literal: true

require 'spec_helper'

RSpec.describe Gitlab::Cleanup::Redis::BatchDeleteByPattern, :clean_gitlab_redis_cache do
  subject { described_class.new(patterns) }

  describe 'execute' do
    context 'when no patterns are passed' do
      before do
        expect(Gitlab::Redis::Cache).not_to receive(:with)
      end

      context 'with nil patterns' do
        let(:patterns) { nil }

        specify { expect { subject }.to raise_error(ArgumentError, 'Argument should be an Array of patterns') }
      end

      context 'with empty array patterns' do
        let(:patterns) { [] }

        specify { subject.execute }
      end
    end

    context 'with patterns' do
      context 'when key is not found' do
        let(:patterns) { ['key'] }

        before do
          expect_any_instance_of(Redis).not_to receive(:del) # rubocop:disable RSpec/AnyInstanceOf
        end

        specify { subject.execute }
      end

      context 'with cache data' do
        let(:cache_keys) { %w[key-test1 key-test2 key-test3 key-test4] }

        before do
          stub_const("#{described_class}::REDIS_CLEAR_BATCH_SIZE", 2)
          write_to_cache
        end

        context 'with one key' do
          let(:patterns) { ['key-test1'] }

          it 'deletes the key' do
            expect_any_instance_of(Redis).to receive(:del).with(patterns.first).once # rubocop:disable RSpec/AnyInstanceOf

            subject.execute
          end
        end

        context 'with many keys' do
          let(:patterns) { %w[key-test1 key-test2] }

          it 'deletes keys for each pattern separately' do
            expect_any_instance_of(Redis).to receive(:del).with(patterns.first).once # rubocop:disable RSpec/AnyInstanceOf
            expect_any_instance_of(Redis).to receive(:del).with(patterns.last).once # rubocop:disable RSpec/AnyInstanceOf

            subject.execute
          end
        end

        context 'with cache_keys over batch size' do
          let(:patterns) { %w[key-test*] }

          it 'deletes matched keys in batches' do
            # redis scan returns the values in random order so just checking it is being called twice meaning
            # scan returned results in 2 batches, which is what we expect
            key_like = start_with('key-test')
            expect_any_instance_of(Redis).to receive(:del).with(key_like, key_like).twice # rubocop:disable RSpec/AnyInstanceOf

            subject.execute
          end
        end
      end
    end
  end
end

def write_to_cache
  Gitlab::Redis::Cache.with do |redis|
    cache_keys.each_with_index do |cache_key, index|
      redis.set(cache_key, index)
    end
  end
end
# frozen_string_literal: true

require 'spec_helper'

RSpec.describe Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder, :clean_gitlab_redis_cache do
  subject { described_class.new(project_ids).execute }

  describe 'execute' do
    context 'when build pattern for all description templates' do
      RSpec.shared_examples 'all issue and merge request templates pattern' do
        it 'builds pattern to remove all issue and merge request templates keys' do
          expect(subject.count).to eq(2)
          expect(subject).to match_array(%W[
            #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:issue_template_names_hash:*
            #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:merge_request_template_names_hash:*
          ])
        end
      end

      context 'with project_ids == :all' do
        let(:project_ids) { :all }

        it_behaves_like 'all issue and merge request templates pattern'
      end
    end

    context 'with project_ids' do
      let_it_be(:project1) { create(:project, :repository) }
      let_it_be(:project2) { create(:project, :repository) }

      context 'with nil project_ids' do
        let(:project_ids) { nil }

        specify { expect { subject }.to raise_error(ArgumentError, 'project_ids can either be an array of project IDs or :all') }
      end

      context 'with project_ids as string' do
        let(:project_ids) { '1' }

        specify { expect { subject }.to raise_error(ArgumentError, 'project_ids can either be an array of project IDs or :all') }
      end

      context 'with invalid project_ids as array of strings' do
        let(:project_ids) { %w[a b] }

        specify { expect { subject }.to raise_error(ArgumentError, 'Invalid Project ID. Please ensure all passed in project ids values are valid integer project ids.') }
      end

      context 'with non existent project id' do
        let(:project_ids) { [non_existing_record_id] }

        it 'no patterns are built' do
          expect(subject.count).to eq(0)
        end
      end

      context 'with one project_id' do
        let(:project_ids) { [project1.id] }

        it 'builds patterns for the project' do
          expect(subject.count).to eq(2)
          expect(subject).to match_array(%W[
            #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:issue_template_names_hash:#{project1.full_path}:#{project1.id}
            #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:merge_request_template_names_hash:#{project1.full_path}:#{project1.id}
          ])
        end
      end

      context 'with many project_ids' do
        let(:project_ids) { [project1.id, project2.id] }

        RSpec.shared_examples 'builds patterns for the given projects' do
          it 'builds patterns for the given projects' do
            expect(subject.count).to eq(4)
            expect(subject).to match_array(%W[
              #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:issue_template_names_hash:#{project1.full_path}:#{project1.id}
              #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:merge_request_template_names_hash:#{project1.full_path}:#{project1.id}
              #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:issue_template_names_hash:#{project2.full_path}:#{project2.id}
              #{Gitlab::Redis::Cache::CACHE_NAMESPACE}:merge_request_template_names_hash:#{project2.full_path}:#{project2.id}
            ])
          end
        end

        it_behaves_like 'builds patterns for the given projects'

        context 'with project_ids as string' do
          let(:project_ids) { [project1.id.to_s, project2.id.to_s] }

          it_behaves_like 'builds patterns for the given projects'
        end
      end
    end
  end
end
@@ -2,7 +2,7 @@
require 'rake_helper'

RSpec.describe 'clearing redis cache', :clean_gitlab_redis_cache do
  before do
    Rake.application.rake_require 'tasks/cache'
  end

@@ -21,4 +21,27 @@ RSpec.describe 'clearing redis cache' do
      expect { run_rake_task('cache:clear:redis') }.to change { pipeline_status.has_cache? }
    end
  end

  describe 'invoking clear description templates cache rake task' do
    using RSpec::Parameterized::TableSyntax

    before do
      stub_env('project_ids', project_ids) if project_ids

      service = double(:service, execute: true)
      expect(Gitlab::Cleanup::Redis::DescriptionTemplatesCacheKeysPatternBuilder).to receive(:new).with(expected_project_ids).and_return(service)
      expect(Gitlab::Cleanup::Redis::BatchDeleteByPattern).to receive(:new).and_return(service)
    end

    where(:project_ids, :expected_project_ids) do
      nil                    | [] # this acts as no argument is being passed
      '1'                    | %w[1]
      '1, 2, 3'              | %w[1 2 3]
      '1, 2, some-string, 3' | %w[1 2 some-string 3]
    end

    with_them do
      specify { run_rake_task('cache:clear:description_templates') }
    end
  end
end