Commit 23997246 authored by GitLab Bot's avatar GitLab Bot

Add latest changes from gitlab-org/gitlab@master

parent 6755df10
...@@ -1328,7 +1328,7 @@ DEPENDENCIES ...@@ -1328,7 +1328,7 @@ DEPENDENCIES
request_store (~> 1.3) request_store (~> 1.3)
responders (~> 3.0) responders (~> 3.0)
retriable (~> 3.1.2) retriable (~> 3.1.2)
rouge (~> 3.11.0) rouge (~> 3.15.0)
rqrcode-rails3 (~> 0.1.7) rqrcode-rails3 (~> 0.1.7)
rspec-parameterized rspec-parameterized
rspec-rails (~> 4.0.0.beta3) rspec-rails (~> 4.0.0.beta3)
......
---
title: Document CI job activity limit for pipeline creation
merge_request: 23246
author:
type: added
...@@ -42,3 +42,35 @@ Activity history for projects and individuals' profiles was limited to one year ...@@ -42,3 +42,35 @@ Activity history for projects and individuals' profiles was limited to one year
A maximum number of project webhooks applies to each GitLab.com tier. Check the A maximum number of project webhooks applies to each GitLab.com tier. Check the
[Maximum number of webhooks (per tier)](../user/project/integrations/webhooks.md#maximum-number-of-webhooks-per-tier) [Maximum number of webhooks (per tier)](../user/project/integrations/webhooks.md#maximum-number-of-webhooks-per-tier)
section in the Webhooks page. section in the Webhooks page.
## CI/CD limits
### Number of jobs in active pipelines
> [Introduced](https://gitlab.com/gitlab-org/gitlab/issues/32823) in GitLab 12.6.
The total number of jobs in active pipelines can be limited per project. This limit is checked
each time a new pipeline is created. An active pipeline is any pipeline in one of the following states:
- `created`
- `pending`
- `running`
If a new pipeline would cause the total number of jobs to exceed the limit, the pipeline
will fail with a `job_activity_limit_exceeded` error.
- On GitLab.com, different [limits are defined per plan](../user/gitlab_com/index.md#gitlab-cicd), and they affect all projects under that plan.
- On self-hosted installations with a [GitLab Starter](https://about.gitlab.com/pricing/#self-managed) tier or higher, this limit is defined for the `default` plan, which affects all projects.
This limit is disabled by default.
To set this limit on a self-hosted installation, run the following in the
[GitLab Rails console](https://docs.gitlab.com/omnibus/maintenance/#starting-a-rails-console-session):
```ruby
# If limits don't exist for the default plan, you can create one with:
# Plan.default.create_limits!
Plan.default.limits.update!(ci_active_jobs: 500)
```
Set the limit to `0` to disable it.
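To check the value that is currently configured, you can read the same attribute back in the Rails console. This is a minimal sketch and assumes a limits row already exists for the `default` plan (see the comment in the example above); otherwise `Plan.default.limits` returns `nil`:
```ruby
# Assumes limits exist for the default plan (Plan.default.create_limits!).
# Returns the configured value; 0 means the limit is disabled.
Plan.default.limits.ci_active_jobs
```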
...@@ -121,7 +121,9 @@ In this example we can see that server processed an HTTP request with URL ...@@ -121,7 +121,9 @@ In this example we can see that server processed an HTTP request with URL
## `api_json.log` ## `api_json.log`
Introduced in GitLab 10.0, this file lives in > Introduced in GitLab 10.0.
This file lives in
`/var/log/gitlab/gitlab-rails/api_json.log` for Omnibus GitLab packages or in `/var/log/gitlab/gitlab-rails/api_json.log` for Omnibus GitLab packages or in
`/home/git/gitlab/log/api_json.log` for installations from source. `/home/git/gitlab/log/api_json.log` for installations from source.
...@@ -159,6 +161,21 @@ October 07, 2014 11:25: User "Claudie Hodkiewicz" (nasir_stehr@olson.co.uk) was ...@@ -159,6 +161,21 @@ October 07, 2014 11:25: User "Claudie Hodkiewicz" (nasir_stehr@olson.co.uk) was
October 07, 2014 11:25: Project "project133" was removed October 07, 2014 11:25: Project "project133" was removed
``` ```
## `application_json.log`
> [Introduced](https://gitlab.com/gitlab-org/gitlab/issues/22812) in GitLab 12.7.
This file lives in `/var/log/gitlab/gitlab-rails/application_json.log` for
Omnibus GitLab packages or in `/home/git/gitlab/log/application_json.log` for
installations from source.
It contains the JSON version of the logs in `application.log`, like the example below:
```json
{"severity":"INFO","time":"2020-01-14T13:35:15.466Z","correlation_id":"3823a1550b64417f9c9ed8ee0f48087e","message":"User \"Administrator\" (admin@example.com) was created"}
{"severity":"INFO","time":"2020-01-14T13:35:15.466Z","correlation_id":"78e3df10c9a18745243d524540bd5be4","message":"Project \"project133\" was removed"}
```
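Because each line is a self-contained JSON object, these logs are easy to filter programmatically. The following is an illustrative sketch (not part of GitLab) that prints every entry recorded under a given correlation ID; the path and field names follow the Omnibus example above, and it assumes every line is valid JSON:
```ruby
# Illustrative sketch: print application_json.log entries for one correlation ID.
require 'json'

log_path = '/var/log/gitlab/gitlab-rails/application_json.log' # use /home/git/gitlab/log/... for source installs
correlation_id = '3823a1550b64417f9c9ed8ee0f48087e'            # example value from the log excerpt above

File.foreach(log_path) do |line|
  entry = JSON.parse(line)
  next unless entry['correlation_id'] == correlation_id

  puts "#{entry['time']} #{entry['severity']} #{entry['message']}"
end
```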
## `integrations_json.log` ## `integrations_json.log`
This file lives in `/var/log/gitlab/gitlab-rails/integrations_json.log` for This file lives in `/var/log/gitlab/gitlab-rails/integrations_json.log` for
...@@ -174,7 +191,9 @@ It contains information about [integrations](../user/project/integrations/projec ...@@ -174,7 +191,9 @@ It contains information about [integrations](../user/project/integrations/projec
## `kubernetes.log` ## `kubernetes.log`
Introduced in GitLab 11.6. This file lives in > Introduced in GitLab 11.6.
This file lives in
`/var/log/gitlab/gitlab-rails/kubernetes.log` for Omnibus GitLab `/var/log/gitlab/gitlab-rails/kubernetes.log` for Omnibus GitLab
packages or in `/home/git/gitlab/log/kubernetes.log` for packages or in `/home/git/gitlab/log/kubernetes.log` for
installations from source. installations from source.
...@@ -320,13 +339,17 @@ It logs information whenever a [repository check is run][repocheck] on a project ...@@ -320,13 +339,17 @@ It logs information whenever a [repository check is run][repocheck] on a project
## `importer.log` ## `importer.log`
Introduced in GitLab 11.3. This file lives in `/var/log/gitlab/gitlab-rails/importer.log` for > Introduced in GitLab 11.3.
This file lives in `/var/log/gitlab/gitlab-rails/importer.log` for
Omnibus GitLab packages or in `/home/git/gitlab/log/importer.log` for Omnibus GitLab packages or in `/home/git/gitlab/log/importer.log` for
installations from source. installations from source.
## `auth.log` ## `auth.log`
Introduced in GitLab 12.0. This file lives in `/var/log/gitlab/gitlab-rails/auth.log` for > Introduced in GitLab 12.0.
This file lives in `/var/log/gitlab/gitlab-rails/auth.log` for
Omnibus GitLab packages or in `/home/git/gitlab/log/auth.log` for Omnibus GitLab packages or in `/home/git/gitlab/log/auth.log` for
installations from source. installations from source.
...@@ -356,7 +379,9 @@ GraphQL queries are recorded in that file. For example: ...@@ -356,7 +379,9 @@ GraphQL queries are recorded in that file. For example:
## `migrations.log` ## `migrations.log`
Introduced in GitLab 12.3. This file lives in `/var/log/gitlab/gitlab-rails/migrations.log` for > Introduced in GitLab 12.3.
This file lives in `/var/log/gitlab/gitlab-rails/migrations.log` for
Omnibus GitLab packages or in `/home/git/gitlab/log/migrations.log` for Omnibus GitLab packages or in `/home/git/gitlab/log/migrations.log` for
installations from source. installations from source.
...@@ -406,7 +431,9 @@ It is stored at: ...@@ -406,7 +431,9 @@ It is stored at:
## `elasticsearch.log` ## `elasticsearch.log`
Introduced in GitLab 12.6. This file lives in > Introduced in GitLab 12.6.
This file lives in
`/var/log/gitlab/gitlab-rails/elasticsearch.log` for Omnibus GitLab `/var/log/gitlab/gitlab-rails/elasticsearch.log` for Omnibus GitLab
packages or in `/home/git/gitlab/log/elasticsearch.log` for installations packages or in `/home/git/gitlab/log/elasticsearch.log` for installations
from source. from source.
......
...@@ -75,6 +75,7 @@ Below are the current settings regarding [GitLab CI/CD](../../ci/README.md). ...@@ -75,6 +75,7 @@ Below are the current settings regarding [GitLab CI/CD](../../ci/README.md).
| Artifacts maximum size (uncompressed) | 1G | 100M | | Artifacts maximum size (uncompressed) | 1G | 100M |
| Artifacts [expiry time](../../ci/yaml/README.md#artifactsexpire_in) | kept forever | deleted after 30 days unless otherwise specified | | Artifacts [expiry time](../../ci/yaml/README.md#artifactsexpire_in) | kept forever | deleted after 30 days unless otherwise specified |
| Scheduled Pipeline Cron | `*/5 * * * *` | `*/19 * * * *` | | Scheduled Pipeline Cron | `*/5 * * * *` | `*/19 * * * *` |
| [Max jobs in active pipelines](../../administration/instance_limits.md#number-of-jobs-in-active-pipelines) | `500` for Free tier, unlimited otherwise | Unlimited |
## Repository size limit ## Repository size limit
...@@ -95,7 +96,11 @@ IP based firewall can be configured by looking up all ...@@ -95,7 +96,11 @@ IP based firewall can be configured by looking up all
## Shared Runners ## Shared Runners
Shared Runners on GitLab.com run in [autoscale mode] and powered by Google Cloud Platform. GitLab offers Linux and Windows shared runners hosted on GitLab.com for executing your pipelines.
### Linux Shared Runners
Linux Shared Runners on GitLab.com run in [autoscale mode] and are powered by Google Cloud Platform.
Autoscaling means reduced waiting times to spin up CI/CD jobs, and isolated VMs for each project, Autoscaling means reduced waiting times to spin up CI/CD jobs, and isolated VMs for each project,
thus maximizing security. They're free to use for public open source projects and limited thus maximizing security. They're free to use for public open source projects and limited
to 2000 CI minutes per month per group for private projects. More minutes to 2000 CI minutes per month per group for private projects. More minutes
...@@ -122,7 +127,7 @@ Below are the shared Runners settings. ...@@ -122,7 +127,7 @@ Below are the shared Runners settings.
| Default Docker image | `ruby:2.5` | - | | Default Docker image | `ruby:2.5` | - |
| `privileged` (run [Docker in Docker]) | `true` | `false` | | `privileged` (run [Docker in Docker]) | `true` | `false` |
### `config.toml` #### `config.toml`
The full contents of our `config.toml` are: The full contents of our `config.toml` are:
...@@ -184,6 +189,158 @@ sentry_dsn = "X" ...@@ -184,6 +189,158 @@ sentry_dsn = "X"
BucketName = "bucket-name" BucketName = "bucket-name"
``` ```
### Windows Shared Runners (beta)
The Windows Shared Runners are currently in
[beta](https://about.gitlab.com/handbook/product/#beta) and should not be used
for production workloads.
During the beta period, for groups and private projects, the use of
Windows Shared Runners counts towards the [shared runner pipeline
quota](https://docs.gitlab.com/ee/user/admin_area/settings/continuous_integration.html#shared-runners-pipeline-minutes-quota-starter-only)
as if they were Linux Runners. We plan to change this in
[#30835](https://gitlab.com/gitlab-org/gitlab/issues/30834).
Windows Shared Runners on GitLab.com autoscale by launching virtual
machines on Google Cloud Platform. This solution uses
a new [autoscaling driver](https://gitlab.com/gitlab-org/ci-cd/custom-executor-drivers/autoscaler/tree/master/docs/readme.md)
developed by GitLab for the [custom executor](https://docs.gitlab.com/runner/executors/custom.html).
Windows Shared Runners execute your CI/CD jobs on `n1-standard-2` instances with 2
vCPUs and 7.5GB RAM. You can find a full list of available Windows packages in the
[package documentation](https://gitlab.com/gitlab-org/ci-cd/shared-runners/images/gcp/windows-containers/blob/master/cookbooks/preinstalled-software/README.md).
We plan to keep iterating until Windows Shared Runners reach a stable state and are
[generally available](https://about.gitlab.com/handbook/product/#generally-available-ga).
You can follow our work towards this goal in the
[related epic](https://gitlab.com/groups/gitlab-org/-/epics/2162).
#### Configuration
The full contents of our `config.toml` are:
```toml
concurrent = 10
check_interval = 3
[[runners]]
name = "windows-runner"
url = "https://gitlab.com/"
token = "TOKEN"
executor = "custom"
builds_dir = "C:\\GitLab-Runner\\builds"
cache_dir = "C:\\GitLab-Runner\\cache"
shell = "powershell"
[runners.custom]
config_exec = "C:\\GitLab-Runner\\autoscaler\\autoscaler.exe"
config_args = ["--config", "C:\\GitLab-Runner\\autoscaler\\config.toml", "custom", "config"]
prepare_exec = "C:\\GitLab-Runner\\autoscaler\\autoscaler.exe"
prepare_args = ["--config", "C:\\GitLab-Runner\\autoscaler\\config.toml", "custom", "prepare"]
run_exec = "C:\\GitLab-Runner\\autoscaler\\autoscaler.exe"
run_args = ["--config", "C:\\GitLab-Runner\\autoscaler\\config.toml", "custom", "run"]
cleanup_exec = "C:\\GitLab-Runner\\autoscaler\\autoscaler.exe"
cleanup_args = ["--config", "C:\\GitLab-Runner\\autoscaler\\config.toml", "custom", "cleanup"]
```
The full contents of our `autoscaler/config.toml` are:
```toml
Provider = "gcp"
Executor = "winrm"
OS = "windows"
LogLevel = "info"
LogFormat = "text"
LogFile = "C:\\GitLab-Runner\\autoscaler\\autoscaler.log"
VMTag = "windows"
[GCP]
ServiceAccountFile = "PATH"
Project = "some-project-df9323"
Zone = "us-east1-c"
MachineType = "n1-standard-2"
Image = "IMAGE"
DiskSize = 50
DiskType = "pd-standard"
Subnetwork = "default"
Network = "default"
Tags = ["TAGS"]
Username = "gitlab_runner"
[WinRM]
MaximumTimeout = 3600
ExecutionMaxRetries = 0
[ProviderCache]
Enabled = true
Directory = "C:\\GitLab-Runner\\autoscaler\\machines"
```
#### Example
Below is a simple `.gitlab-ci.yml` file to show how to start using the
Windows Shared Runners:
```yaml
.shared_windows_runners:
tags:
- shared
- windows
- windows-1809
stages:
- build
- test
before_script:
- Set-Variable -Name time -Value (date -Format "hh:mm")
- echo ${time}
- echo "started by ${GITLAB_USER_NAME}"
build:
extends:
- .shared_windows_runners
stage: build
script:
- echo "running scripts in the build job"
test:
extends:
- .shared_windows_runners
stage: test
script:
- echo "running scripts in the test job"
```
#### Limitations and known issues
- All the limitations mentioned in our [beta
definition](https://about.gitlab.com/handbook/product/#beta).
- The average provisioning time for a new Windows VM is 5 minutes.
This means that, for the beta, you will notice slower build start times
on the Windows Shared Runner fleet compared to Linux. In a future
release we will update the autoscaler to enable the pre-warming of
virtual machines, which will significantly reduce the time it takes to
provision a VM on the Windows fleet. You can follow along in this
[issue](https://gitlab.com/gitlab-org/ci-cd/custom-executor-drivers/autoscaler/issues/32).
- The Windows Shared Runner fleet may be unavailable occasionally
for maintenance or updates.
- The Windows Shared Runner virtual machine instances do not use the
GitLab Docker executor. This means that, unlike the Linux Shared
Runners, you will not be able to specify `image` and `services` in
your pipeline configuration.
- For the beta release, we have included a set of software packages in
the base VM image. If your CI job requires additional software that's
not included in this list, then you will need to add installation
commands to [`before_script`](../../ci/yaml/README.md#before_script-and-after_script) or [`script`](../../ci/yaml/README.md#script) to install the required
software. Note that each job runs on a new VM instance, so the
installation of additional software packages needs to be repeated for
each job in your pipeline.
- A job may stay in a pending state for longer than it would on the
Linux Shared Runners.
- We may introduce breaking changes that require updates to pipelines
that are using the Windows Shared Runner fleet.
## Sidekiq ## Sidekiq
GitLab.com runs [Sidekiq](https://sidekiq.org) with arguments `--timeout=4 --concurrency=4` GitLab.com runs [Sidekiq](https://sidekiq.org) with arguments `--timeout=4 --concurrency=4`
......
...@@ -54,8 +54,8 @@ Design Management requires that projects are using ...@@ -54,8 +54,8 @@ Design Management requires that projects are using
- Design Management data [won't be moved](https://gitlab.com/gitlab-org/gitlab/issues/13426) - Design Management data [won't be moved](https://gitlab.com/gitlab-org/gitlab/issues/13426)
when an issue is moved, nor [deleted](https://gitlab.com/gitlab-org/gitlab/issues/13427) when an issue is moved, nor [deleted](https://gitlab.com/gitlab-org/gitlab/issues/13427)
when an issue is deleted. when an issue is deleted.
- Design Management - From GitLab 12.7, Design Management data [can be replicated](../../../administration/geo/replication/datatypes.md#limitations-on-replicationverification)
[isn't supported by Geo](https://gitlab.com/groups/gitlab-org/-/epics/1633) yet. by Geo but [not verified](https://gitlab.com/gitlab-org/gitlab/issues/32467).
- Only the latest version of the designs can be deleted. - Only the latest version of the designs can be deleted.
- Deleted designs cannot be recovered but you can see them on previous designs versions. - Deleted designs cannot be recovered but you can see them on previous designs versions.
......
...@@ -8,7 +8,7 @@ import TimeSeries from '~/monitoring/components/charts/time_series.vue'; ...@@ -8,7 +8,7 @@ import TimeSeries from '~/monitoring/components/charts/time_series.vue';
import * as types from '~/monitoring/stores/mutation_types'; import * as types from '~/monitoring/stores/mutation_types';
import { import {
deploymentData, deploymentData,
metricsGroupsAPIResponse, metricsDashboardPayload,
mockedQueryResultPayload, mockedQueryResultPayload,
mockProjectDir, mockProjectDir,
mockHost, mockHost,
...@@ -34,7 +34,7 @@ describe('Time series component', () => { ...@@ -34,7 +34,7 @@ describe('Time series component', () => {
store.commit( store.commit(
`monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`, `monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`,
metricsGroupsAPIResponse, metricsDashboardPayload,
); );
store.commit(`monitoringDashboard/${types.RECEIVE_DEPLOYMENTS_DATA_SUCCESS}`, deploymentData); store.commit(`monitoringDashboard/${types.RECEIVE_DEPLOYMENTS_DATA_SUCCESS}`, deploymentData);
......
...@@ -14,9 +14,8 @@ import { createStore } from '~/monitoring/stores'; ...@@ -14,9 +14,8 @@ import { createStore } from '~/monitoring/stores';
import * as types from '~/monitoring/stores/mutation_types'; import * as types from '~/monitoring/stores/mutation_types';
import { setupComponentStore, propsData } from '../init_utils'; import { setupComponentStore, propsData } from '../init_utils';
import { import {
metricsGroupsAPIResponse, metricsDashboardPayload,
mockedQueryResultPayload, mockedQueryResultPayload,
mockApiEndpoint,
environmentData, environmentData,
dashboardGitResponse, dashboardGitResponse,
} from '../mock_data'; } from '../mock_data';
...@@ -33,6 +32,9 @@ describe('Dashboard', () => { ...@@ -33,6 +32,9 @@ describe('Dashboard', () => {
wrapper = shallowMount(Dashboard, { wrapper = shallowMount(Dashboard, {
localVue, localVue,
propsData: { ...propsData, ...props }, propsData: { ...propsData, ...props },
methods: {
fetchData: jest.fn(),
},
store, store,
...options, ...options,
}); });
...@@ -42,6 +44,9 @@ describe('Dashboard', () => { ...@@ -42,6 +44,9 @@ describe('Dashboard', () => {
wrapper = mount(Dashboard, { wrapper = mount(Dashboard, {
localVue, localVue,
propsData: { ...propsData, ...props }, propsData: { ...propsData, ...props },
methods: {
fetchData: jest.fn(),
},
store, store,
...options, ...options,
}); });
...@@ -55,21 +60,16 @@ describe('Dashboard', () => { ...@@ -55,21 +60,16 @@ describe('Dashboard', () => {
afterEach(() => { afterEach(() => {
if (wrapper) { if (wrapper) {
wrapper.destroy(); wrapper.destroy();
wrapper = null;
} }
mock.restore(); mock.restore();
}); });
describe('no metrics are available yet', () => { describe('no metrics are available yet', () => {
beforeEach(() => { beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse);
createShallowWrapper(); createShallowWrapper();
}); });
afterEach(() => {
wrapper.destroy();
});
it('shows the environment selector', () => { it('shows the environment selector', () => {
expect(wrapper.vm.$el.querySelector('.js-environments-dropdown')).toBeTruthy(); expect(wrapper.vm.$el.querySelector('.js-environments-dropdown')).toBeTruthy();
}); });
...@@ -77,29 +77,19 @@ describe('Dashboard', () => { ...@@ -77,29 +77,19 @@ describe('Dashboard', () => {
describe('no data found', () => { describe('no data found', () => {
beforeEach(done => { beforeEach(done => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse);
createShallowWrapper(); createShallowWrapper();
wrapper.vm.$nextTick(done); wrapper.vm.$nextTick(done);
}); });
afterEach(() => {
wrapper.destroy();
});
it('shows the environment selector dropdown', () => { it('shows the environment selector dropdown', () => {
expect(wrapper.vm.$el.querySelector('.js-environments-dropdown')).toBeTruthy(); expect(wrapper.vm.$el.querySelector('.js-environments-dropdown')).toBeTruthy();
}); });
}); });
describe('request information to the server', () => { describe('request information to the server', () => {
beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse);
});
it('shows up a loading state', done => { it('shows up a loading state', done => {
createShallowWrapper({ hasMetrics: true }); createShallowWrapper({ hasMetrics: true }, { methods: {} });
wrapper.vm wrapper.vm
.$nextTick() .$nextTick()
...@@ -153,17 +143,11 @@ describe('Dashboard', () => { ...@@ -153,17 +143,11 @@ describe('Dashboard', () => {
describe('when all requests have been committed by the store', () => { describe('when all requests have been committed by the store', () => {
beforeEach(() => { beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse);
createMountedWrapper({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] }); createMountedWrapper({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] });
setupComponentStore(wrapper); setupComponentStore(wrapper);
}); });
afterEach(() => {
wrapper.destroy();
});
it('renders the environments dropdown with a number of environments', done => { it('renders the environments dropdown with a number of environments', done => {
wrapper.vm wrapper.vm
.$nextTick() .$nextTick()
...@@ -211,7 +195,7 @@ describe('Dashboard', () => { ...@@ -211,7 +195,7 @@ describe('Dashboard', () => {
wrapper.vm.$store.commit( wrapper.vm.$store.commit(
`monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`, `monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`,
metricsGroupsAPIResponse, metricsDashboardPayload,
); );
wrapper.vm.$store.commit( wrapper.vm.$store.commit(
`monitoringDashboard/${types.RECEIVE_METRIC_RESULT_SUCCESS}`, `monitoringDashboard/${types.RECEIVE_METRIC_RESULT_SUCCESS}`,
...@@ -247,8 +231,6 @@ describe('Dashboard', () => { ...@@ -247,8 +231,6 @@ describe('Dashboard', () => {
describe('when one of the metrics is missing', () => { describe('when one of the metrics is missing', () => {
beforeEach(done => { beforeEach(done => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse);
createShallowWrapper({ hasMetrics: true }); createShallowWrapper({ hasMetrics: true });
setupComponentStore(wrapper); setupComponentStore(wrapper);
...@@ -278,10 +260,6 @@ describe('Dashboard', () => { ...@@ -278,10 +260,6 @@ describe('Dashboard', () => {
const findDraggablePanels = () => wrapper.findAll('.js-draggable-panel'); const findDraggablePanels = () => wrapper.findAll('.js-draggable-panel');
const findRearrangeButton = () => wrapper.find('.js-rearrange-button'); const findRearrangeButton = () => wrapper.find('.js-rearrange-button');
beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse);
});
beforeEach(done => { beforeEach(done => {
createShallowWrapper({ hasMetrics: true }); createShallowWrapper({ hasMetrics: true });
...@@ -290,10 +268,6 @@ describe('Dashboard', () => { ...@@ -290,10 +268,6 @@ describe('Dashboard', () => {
wrapper.vm.$nextTick(done); wrapper.vm.$nextTick(done);
}); });
afterEach(() => {
wrapper.destroy();
});
it('wraps vuedraggable', () => { it('wraps vuedraggable', () => {
expect(findDraggablePanels().exists()).toBe(true); expect(findDraggablePanels().exists()).toBe(true);
expect(findDraggablePanels().length).toEqual(expectedPanelCount); expect(findDraggablePanels().length).toEqual(expectedPanelCount);
...@@ -332,7 +306,7 @@ describe('Dashboard', () => { ...@@ -332,7 +306,7 @@ describe('Dashboard', () => {
it('metrics can be swapped', done => { it('metrics can be swapped', done => {
const firstDraggable = findDraggables().at(0); const firstDraggable = findDraggables().at(0);
const mockMetrics = [...metricsGroupsAPIResponse.panel_groups[1].panels]; const mockMetrics = [...metricsDashboardPayload.panel_groups[1].panels];
const firstTitle = mockMetrics[0].title; const firstTitle = mockMetrics[0].title;
const secondTitle = mockMetrics[1].title; const secondTitle = mockMetrics[1].title;
...@@ -384,10 +358,6 @@ describe('Dashboard', () => { ...@@ -384,10 +358,6 @@ describe('Dashboard', () => {
wrapper.vm.$nextTick(done); wrapper.vm.$nextTick(done);
}); });
afterEach(() => {
wrapper.destroy();
});
it('renders correctly', () => { it('renders correctly', () => {
expect(wrapper.isVueInstance()).toBe(true); expect(wrapper.isVueInstance()).toBe(true);
expect(wrapper.exists()).toBe(true); expect(wrapper.exists()).toBe(true);
...@@ -398,8 +368,6 @@ describe('Dashboard', () => { ...@@ -398,8 +368,6 @@ describe('Dashboard', () => {
const findEditLink = () => wrapper.find('.js-edit-link'); const findEditLink = () => wrapper.find('.js-edit-link');
beforeEach(done => { beforeEach(done => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse);
createShallowWrapper({ hasMetrics: true }); createShallowWrapper({ hasMetrics: true });
wrapper.vm.$store.commit( wrapper.vm.$store.commit(
...@@ -409,10 +377,6 @@ describe('Dashboard', () => { ...@@ -409,10 +377,6 @@ describe('Dashboard', () => {
wrapper.vm.$nextTick(done); wrapper.vm.$nextTick(done);
}); });
afterEach(() => {
wrapper.destroy();
});
it('is not present for the default dashboard', () => { it('is not present for the default dashboard', () => {
expect(findEditLink().exists()).toBe(false); expect(findEditLink().exists()).toBe(false);
}); });
...@@ -435,8 +399,6 @@ describe('Dashboard', () => { ...@@ -435,8 +399,6 @@ describe('Dashboard', () => {
describe('Dashboard dropdown', () => { describe('Dashboard dropdown', () => {
beforeEach(() => { beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse);
createMountedWrapper({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] }); createMountedWrapper({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] });
wrapper.vm.$store.commit( wrapper.vm.$store.commit(
...@@ -460,8 +422,6 @@ describe('Dashboard', () => { ...@@ -460,8 +422,6 @@ describe('Dashboard', () => {
describe('external dashboard link', () => { describe('external dashboard link', () => {
beforeEach(() => { beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse);
createMountedWrapper( createMountedWrapper(
{ {
hasMetrics: true, hasMetrics: true,
...@@ -497,17 +457,11 @@ describe('Dashboard', () => { ...@@ -497,17 +457,11 @@ describe('Dashboard', () => {
const clipboardText = () => link().element.dataset.clipboardText; const clipboardText = () => link().element.dataset.clipboardText;
beforeEach(done => { beforeEach(done => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse);
createShallowWrapper({ hasMetrics: true, currentDashboard }); createShallowWrapper({ hasMetrics: true, currentDashboard });
setTimeout(done); setTimeout(done);
}); });
afterEach(() => {
wrapper.destroy();
});
it('adds a copy button to the dropdown', () => { it('adds a copy button to the dropdown', () => {
expect(link().text()).toContain('Generate link to chart'); expect(link().text()).toContain('Generate link to chart');
}); });
......
...@@ -6,7 +6,7 @@ import statusCodes from '~/lib/utils/http_status'; ...@@ -6,7 +6,7 @@ import statusCodes from '~/lib/utils/http_status';
import Dashboard from '~/monitoring/components/dashboard.vue'; import Dashboard from '~/monitoring/components/dashboard.vue';
import { createStore } from '~/monitoring/stores'; import { createStore } from '~/monitoring/stores';
import { propsData, setupComponentStore } from '../init_utils'; import { propsData, setupComponentStore } from '../init_utils';
import { metricsGroupsAPIResponse, mockApiEndpoint } from '../mock_data'; import { metricsDashboardPayload, mockApiEndpoint } from '../mock_data';
jest.mock('~/lib/utils/url_utility', () => ({ jest.mock('~/lib/utils/url_utility', () => ({
getParameterValues: jest.fn().mockImplementation(param => { getParameterValues: jest.fn().mockImplementation(param => {
...@@ -43,7 +43,7 @@ describe('dashboard time window', () => { ...@@ -43,7 +43,7 @@ describe('dashboard time window', () => {
}); });
it('shows an error message if invalid url parameters are passed', done => { it('shows an error message if invalid url parameters are passed', done => {
mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsGroupsAPIResponse); mock.onGet(mockApiEndpoint).reply(statusCodes.OK, metricsDashboardPayload);
createComponentWrapperMounted({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] }); createComponentWrapperMounted({ hasMetrics: true }, { stubs: ['graph-group', 'panel-type'] });
......
import * as types from '~/monitoring/stores/mutation_types'; import * as types from '~/monitoring/stores/mutation_types';
import { import {
metricsGroupsAPIResponse, metricsDashboardPayload,
mockedEmptyResult, mockedEmptyResult,
mockedQueryResultPayload, mockedQueryResultPayload,
mockedQueryResultPayloadCoresTotal, mockedQueryResultPayloadCoresTotal,
...@@ -23,7 +23,7 @@ export const propsData = { ...@@ -23,7 +23,7 @@ export const propsData = {
emptyNoDataSvgPath: '/path/to/no-data.svg', emptyNoDataSvgPath: '/path/to/no-data.svg',
emptyNoDataSmallSvgPath: '/path/to/no-data-small.svg', emptyNoDataSmallSvgPath: '/path/to/no-data-small.svg',
emptyUnableToConnectSvgPath: '/path/to/unable-to-connect.svg', emptyUnableToConnectSvgPath: '/path/to/unable-to-connect.svg',
environmentsEndpoint: '/root/hello-prometheus/environments/35', environmentsEndpoint: '/root/hello-prometheus/-/environments.json',
currentEnvironmentName: 'production', currentEnvironmentName: 'production',
customMetricsAvailable: false, customMetricsAvailable: false,
customMetricsPath: '', customMetricsPath: '',
...@@ -33,7 +33,7 @@ export const propsData = { ...@@ -33,7 +33,7 @@ export const propsData = {
export const setupComponentStore = wrapper => { export const setupComponentStore = wrapper => {
wrapper.vm.$store.commit( wrapper.vm.$store.commit(
`monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`, `monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`,
metricsGroupsAPIResponse, metricsDashboardPayload,
); );
// Load 3 panels to the dashboard, one with an empty result // Load 3 panels to the dashboard, one with an empty result
......
...@@ -331,81 +331,6 @@ export const mockedQueryResultPayloadCoresTotal = { ...@@ -331,81 +331,6 @@ export const mockedQueryResultPayloadCoresTotal = {
], ],
}; };
export const metricsGroupsAPIResponse = {
dashboard: 'Environment metrics',
panel_groups: [
{
group: 'Response metrics (NGINX Ingress VTS)',
priority: 10,
panels: [
{
metrics: [
{
id: 'response_metrics_nginx_ingress_throughput_status_code',
label: 'Status Code',
metric_id: 1,
prometheus_endpoint_path:
'/root/autodevops-deploy/environments/32/prometheus/api/v1/query_range?query=sum%28rate%28nginx_upstream_responses_total%7Bupstream%3D~%22%25%7Bkube_namespace%7D-%25%7Bci_environment_slug%7D-.%2A%22%7D%5B2m%5D%29%29+by+%28status_code%29',
query_range:
'sum(rate(nginx_upstream_responses_total{upstream=~"%{kube_namespace}-%{ci_environment_slug}-.*"}[2m])) by (status_code)',
unit: 'req / sec',
},
],
title: 'Throughput',
type: 'area-chart',
weight: 1,
y_label: 'Requests / Sec',
},
],
},
{
group: 'System metrics (Kubernetes)',
priority: 5,
panels: [
{
title: 'Memory Usage (Pod average)',
type: 'area-chart',
y_label: 'Memory Used per Pod',
weight: 2,
metrics: [
{
id: 'system_metrics_kubernetes_container_memory_average',
query_range:
'avg(sum(container_memory_usage_bytes{container_name!="POD",pod_name=~"^%{ci_environment_slug}-([^c].*|c([^a]|a([^n]|n([^a]|a([^r]|r[^y])))).*|)-(.*)",namespace="%{kube_namespace}"}) by (job)) without (job) / count(avg(container_memory_usage_bytes{container_name!="POD",pod_name=~"^%{ci_environment_slug}-([^c].*|c([^a]|a([^n]|n([^a]|a([^r]|r[^y])))).*|)-(.*)",namespace="%{kube_namespace}"}) without (job)) /1024/1024',
label: 'Pod average',
unit: 'MB',
metric_id: 17,
prometheus_endpoint_path:
'/root/autodevops-deploy/environments/32/prometheus/api/v1/query_range?query=avg%28sum%28container_memory_usage_bytes%7Bcontainer_name%21%3D%22POD%22%2Cpod_name%3D~%22%5E%25%7Bci_environment_slug%7D-%28%5B%5Ec%5D.%2A%7Cc%28%5B%5Ea%5D%7Ca%28%5B%5En%5D%7Cn%28%5B%5Ea%5D%7Ca%28%5B%5Er%5D%7Cr%5B%5Ey%5D%29%29%29%29.%2A%7C%29-%28.%2A%29%22%2Cnamespace%3D%22%25%7Bkube_namespace%7D%22%7D%29+by+%28job%29%29+without+%28job%29+%2F+count%28avg%28container_memory_usage_bytes%7Bcontainer_name%21%3D%22POD%22%2Cpod_name%3D~%22%5E%25%7Bci_environment_slug%7D-%28%5B%5Ec%5D.%2A%7Cc%28%5B%5Ea%5D%7Ca%28%5B%5En%5D%7Cn%28%5B%5Ea%5D%7Ca%28%5B%5Er%5D%7Cr%5B%5Ey%5D%29%29%29%29.%2A%7C%29-%28.%2A%29%22%2Cnamespace%3D%22%25%7Bkube_namespace%7D%22%7D%29+without+%28job%29%29+%2F1024%2F1024',
appearance: {
line: {
width: 2,
},
},
},
],
},
{
title: 'Core Usage (Total)',
type: 'area-chart',
y_label: 'Total Cores',
weight: 3,
metrics: [
{
id: 'system_metrics_kubernetes_container_cores_total',
query_range:
'avg(sum(rate(container_cpu_usage_seconds_total{container_name!="POD",pod_name=~"^%{ci_environment_slug}-(.*)",namespace="%{kube_namespace}"}[15m])) by (job)) without (job)',
label: 'Total',
unit: 'cores',
metric_id: 13,
},
],
},
],
},
],
};
export const environmentData = [ export const environmentData = [
{ {
id: 34, id: 34,
...@@ -517,6 +442,81 @@ export const metricsDashboardResponse = { ...@@ -517,6 +442,81 @@ export const metricsDashboardResponse = {
status: 'success', status: 'success',
}; };
export const metricsDashboardPayload = {
dashboard: 'Environment metrics',
panel_groups: [
{
group: 'Response metrics (NGINX Ingress VTS)',
priority: 10,
panels: [
{
metrics: [
{
id: 'response_metrics_nginx_ingress_throughput_status_code',
label: 'Status Code',
metric_id: 1,
prometheus_endpoint_path:
'/root/autodevops-deploy/environments/32/prometheus/api/v1/query_range?query=sum%28rate%28nginx_upstream_responses_total%7Bupstream%3D~%22%25%7Bkube_namespace%7D-%25%7Bci_environment_slug%7D-.%2A%22%7D%5B2m%5D%29%29+by+%28status_code%29',
query_range:
'sum(rate(nginx_upstream_responses_total{upstream=~"%{kube_namespace}-%{ci_environment_slug}-.*"}[2m])) by (status_code)',
unit: 'req / sec',
},
],
title: 'Throughput',
type: 'area-chart',
weight: 1,
y_label: 'Requests / Sec',
},
],
},
{
group: 'System metrics (Kubernetes)',
priority: 5,
panels: [
{
title: 'Memory Usage (Pod average)',
type: 'area-chart',
y_label: 'Memory Used per Pod',
weight: 2,
metrics: [
{
id: 'system_metrics_kubernetes_container_memory_average',
query_range:
'avg(sum(container_memory_usage_bytes{container_name!="POD",pod_name=~"^%{ci_environment_slug}-([^c].*|c([^a]|a([^n]|n([^a]|a([^r]|r[^y])))).*|)-(.*)",namespace="%{kube_namespace}"}) by (job)) without (job) / count(avg(container_memory_usage_bytes{container_name!="POD",pod_name=~"^%{ci_environment_slug}-([^c].*|c([^a]|a([^n]|n([^a]|a([^r]|r[^y])))).*|)-(.*)",namespace="%{kube_namespace}"}) without (job)) /1024/1024',
label: 'Pod average',
unit: 'MB',
metric_id: 17,
prometheus_endpoint_path:
'/root/autodevops-deploy/environments/32/prometheus/api/v1/query_range?query=avg%28sum%28container_memory_usage_bytes%7Bcontainer_name%21%3D%22POD%22%2Cpod_name%3D~%22%5E%25%7Bci_environment_slug%7D-%28%5B%5Ec%5D.%2A%7Cc%28%5B%5Ea%5D%7Ca%28%5B%5En%5D%7Cn%28%5B%5Ea%5D%7Ca%28%5B%5Er%5D%7Cr%5B%5Ey%5D%29%29%29%29.%2A%7C%29-%28.%2A%29%22%2Cnamespace%3D%22%25%7Bkube_namespace%7D%22%7D%29+by+%28job%29%29+without+%28job%29+%2F+count%28avg%28container_memory_usage_bytes%7Bcontainer_name%21%3D%22POD%22%2Cpod_name%3D~%22%5E%25%7Bci_environment_slug%7D-%28%5B%5Ec%5D.%2A%7Cc%28%5B%5Ea%5D%7Ca%28%5B%5En%5D%7Cn%28%5B%5Ea%5D%7Ca%28%5B%5Er%5D%7Cr%5B%5Ey%5D%29%29%29%29.%2A%7C%29-%28.%2A%29%22%2Cnamespace%3D%22%25%7Bkube_namespace%7D%22%7D%29+without+%28job%29%29+%2F1024%2F1024',
appearance: {
line: {
width: 2,
},
},
},
],
},
{
title: 'Core Usage (Total)',
type: 'area-chart',
y_label: 'Total Cores',
weight: 3,
metrics: [
{
id: 'system_metrics_kubernetes_container_cores_total',
query_range:
'avg(sum(rate(container_cpu_usage_seconds_total{container_name!="POD",pod_name=~"^%{ci_environment_slug}-(.*)",namespace="%{kube_namespace}"}[15m])) by (job)) without (job)',
label: 'Total',
unit: 'cores',
metric_id: 13,
},
],
},
],
},
],
};
export const dashboardGitResponse = [ export const dashboardGitResponse = [
{ {
default: true, default: true,
......
...@@ -25,7 +25,7 @@ import { ...@@ -25,7 +25,7 @@ import {
deploymentData, deploymentData,
environmentData, environmentData,
metricsDashboardResponse, metricsDashboardResponse,
metricsGroupsAPIResponse, metricsDashboardPayload,
dashboardGitResponse, dashboardGitResponse,
} from '../mock_data'; } from '../mock_data';
...@@ -442,7 +442,7 @@ describe('Monitoring store actions', () => { ...@@ -442,7 +442,7 @@ describe('Monitoring store actions', () => {
beforeEach(() => { beforeEach(() => {
state = storeState(); state = storeState();
[metric] = metricsDashboardResponse.dashboard.panel_groups[0].panels[0].metrics; [metric] = metricsDashboardResponse.dashboard.panel_groups[0].panels[0].metrics;
[data] = metricsGroupsAPIResponse.panel_groups[0].panels[0].metrics; [data] = metricsDashboardPayload.panel_groups[0].panels[0].metrics;
}); });
it('commits result', done => { it('commits result', done => {
......
...@@ -3,7 +3,7 @@ import mutations from '~/monitoring/stores/mutations'; ...@@ -3,7 +3,7 @@ import mutations from '~/monitoring/stores/mutations';
import * as types from '~/monitoring/stores/mutation_types'; import * as types from '~/monitoring/stores/mutation_types';
import { metricStates } from '~/monitoring/constants'; import { metricStates } from '~/monitoring/constants';
import { import {
metricsGroupsAPIResponse, metricsDashboardPayload,
mockedEmptyResult, mockedEmptyResult,
mockedQueryResultPayload, mockedQueryResultPayload,
mockedQueryResultPayloadCoresTotal, mockedQueryResultPayloadCoresTotal,
...@@ -44,7 +44,7 @@ describe('Monitoring store Getters', () => { ...@@ -44,7 +44,7 @@ describe('Monitoring store Getters', () => {
setupState({ setupState({
dashboard: { panel_groups: [] }, dashboard: { panel_groups: [] },
}); });
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
groups = state.dashboard.panel_groups; groups = state.dashboard.panel_groups;
}); });
...@@ -53,21 +53,21 @@ describe('Monitoring store Getters', () => { ...@@ -53,21 +53,21 @@ describe('Monitoring store Getters', () => {
}); });
it('on an empty metric with no result, returns NO_DATA', () => { it('on an empty metric with no result, returns NO_DATA', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedEmptyResult); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedEmptyResult);
expect(getMetricStates()).toEqual([metricStates.NO_DATA]); expect(getMetricStates()).toEqual([metricStates.NO_DATA]);
}); });
it('on a metric with a result, returns OK', () => { it('on a metric with a result, returns OK', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
expect(getMetricStates()).toEqual([metricStates.OK]); expect(getMetricStates()).toEqual([metricStates.OK]);
}); });
it('on a metric with an error, returns an error', () => { it('on a metric with an error, returns an error', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_FAILURE](state, { mutations[types.RECEIVE_METRIC_RESULT_FAILURE](state, {
metricId: groups[0].panels[0].metrics[0].metricId, metricId: groups[0].panels[0].metrics[0].metricId,
}); });
...@@ -76,7 +76,7 @@ describe('Monitoring store Getters', () => { ...@@ -76,7 +76,7 @@ describe('Monitoring store Getters', () => {
}); });
it('on multiple metrics with results, returns OK', () => { it('on multiple metrics with results, returns OK', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal);
...@@ -87,7 +87,7 @@ describe('Monitoring store Getters', () => { ...@@ -87,7 +87,7 @@ describe('Monitoring store Getters', () => {
expect(getMetricStates(state.dashboard.panel_groups[1].key)).toEqual([metricStates.OK]); expect(getMetricStates(state.dashboard.panel_groups[1].key)).toEqual([metricStates.OK]);
}); });
it('on multiple metrics errors', () => { it('on multiple metrics errors', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_FAILURE](state, { mutations[types.RECEIVE_METRIC_RESULT_FAILURE](state, {
metricId: groups[0].panels[0].metrics[0].metricId, metricId: groups[0].panels[0].metrics[0].metricId,
...@@ -106,7 +106,7 @@ describe('Monitoring store Getters', () => { ...@@ -106,7 +106,7 @@ describe('Monitoring store Getters', () => {
}); });
it('on multiple metrics with errors', () => { it('on multiple metrics with errors', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
// A success in 1 group // A success in 1 group
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
...@@ -168,27 +168,27 @@ describe('Monitoring store Getters', () => { ...@@ -168,27 +168,27 @@ describe('Monitoring store Getters', () => {
}); });
it('no loaded metric returns empty', () => { it('no loaded metric returns empty', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
expect(metricsWithData()).toEqual([]); expect(metricsWithData()).toEqual([]);
}); });
it('an empty metric, returns empty', () => { it('an empty metric, returns empty', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedEmptyResult); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedEmptyResult);
expect(metricsWithData()).toEqual([]); expect(metricsWithData()).toEqual([]);
}); });
it('a metric with results, it returns a metric', () => { it('a metric with results, it returns a metric', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
expect(metricsWithData()).toEqual([mockedQueryResultPayload.metricId]); expect(metricsWithData()).toEqual([mockedQueryResultPayload.metricId]);
}); });
it('multiple metrics with results, it return multiple metrics', () => { it('multiple metrics with results, it return multiple metrics', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal);
...@@ -199,7 +199,7 @@ describe('Monitoring store Getters', () => { ...@@ -199,7 +199,7 @@ describe('Monitoring store Getters', () => {
}); });
it('multiple metrics with results, it returns metrics filtered by group', () => { it('multiple metrics with results, it returns metrics filtered by group', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsGroupsAPIResponse); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](state, metricsDashboardPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayload);
mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal); mutations[types.RECEIVE_METRIC_RESULT_SUCCESS](state, mockedQueryResultPayloadCoresTotal);
......
...@@ -5,7 +5,7 @@ import * as types from '~/monitoring/stores/mutation_types'; ...@@ -5,7 +5,7 @@ import * as types from '~/monitoring/stores/mutation_types';
import state from '~/monitoring/stores/state'; import state from '~/monitoring/stores/state';
import { metricStates } from '~/monitoring/constants'; import { metricStates } from '~/monitoring/constants';
import { import {
metricsGroupsAPIResponse, metricsDashboardPayload,
deploymentData, deploymentData,
metricsDashboardResponse, metricsDashboardResponse,
dashboardGitResponse, dashboardGitResponse,
...@@ -23,7 +23,7 @@ describe('Monitoring mutations', () => { ...@@ -23,7 +23,7 @@ describe('Monitoring mutations', () => {
beforeEach(() => { beforeEach(() => {
stateCopy.dashboard.panel_groups = []; stateCopy.dashboard.panel_groups = [];
payload = metricsGroupsAPIResponse; payload = metricsDashboardPayload;
}); });
it('adds a key to the group', () => { it('adds a key to the group', () => {
mutations[types.RECEIVE_METRICS_DATA_SUCCESS](stateCopy, payload); mutations[types.RECEIVE_METRICS_DATA_SUCCESS](stateCopy, payload);
......
...@@ -6,7 +6,7 @@ import * as types from '~/monitoring/stores/mutation_types'; ...@@ -6,7 +6,7 @@ import * as types from '~/monitoring/stores/mutation_types';
import { createStore } from '~/monitoring/stores'; import { createStore } from '~/monitoring/stores';
import axios from '~/lib/utils/axios_utils'; import axios from '~/lib/utils/axios_utils';
import { import {
metricsGroupsAPIResponse, metricsDashboardPayload,
mockedEmptyResult, mockedEmptyResult,
mockedQueryResultPayload, mockedQueryResultPayload,
mockedQueryResultPayloadCoresTotal, mockedQueryResultPayloadCoresTotal,
...@@ -41,7 +41,7 @@ function setupComponentStore(component) { ...@@ -41,7 +41,7 @@ function setupComponentStore(component) {
// Load 2 panel groups // Load 2 panel groups
component.$store.commit( component.$store.commit(
`monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`, `monitoringDashboard/${types.RECEIVE_METRICS_DATA_SUCCESS}`,
metricsGroupsAPIResponse, metricsDashboardPayload,
); );
// Load 3 panels to the dashboard, one with an empty result // Load 3 panels to the dashboard, one with an empty result
...@@ -98,7 +98,7 @@ describe('Dashboard', () => { ...@@ -98,7 +98,7 @@ describe('Dashboard', () => {
let panelToggle; let panelToggle;
let chart; let chart;
beforeEach(() => { beforeEach(() => {
mock.onGet(mockApiEndpoint).reply(200, metricsGroupsAPIResponse); mock.onGet(mockApiEndpoint).reply(200, metricsDashboardPayload);
component = new DashboardComponent({ component = new DashboardComponent({
el: document.querySelector('.prometheus-graphs'), el: document.querySelector('.prometheus-graphs'),
......
...@@ -5,6 +5,14 @@ require 'spec_helper' ...@@ -5,6 +5,14 @@ require 'spec_helper'
describe Gitlab::Email::Receiver do describe Gitlab::Email::Receiver do
include_context :email_shared_context include_context :email_shared_context
shared_examples 'correctly finds the mail key' do
specify do
expect(Gitlab::Email::Handler).to receive(:for).with(an_instance_of(Mail::Message), 'gitlabhq/gitlabhq+auth_token').and_return(handler)
receiver.execute
end
end
context 'when the email contains a valid email address in a header' do context 'when the email contains a valid email address in a header' do
let(:handler) { double(:handler) } let(:handler) { double(:handler) }
......
...@@ -16,6 +16,40 @@ describe Gitlab::MailRoom do ...@@ -16,6 +16,40 @@ describe Gitlab::MailRoom do
} }
end end
shared_examples_for 'only truthy if both enabled and address are truthy' do |target_proc|
context 'with both enabled and address as truthy values' do
it 'is truthy' do
stub_config(enabled: true, address: 'localhost')
expect(target_proc.call).to be_truthy
end
end
context 'with address only as truthy' do
it 'is falsey' do
stub_config(enabled: false, address: 'localhost')
expect(target_proc.call).to be_falsey
end
end
context 'with enabled only as truthy' do
it 'is falsey' do
stub_config(enabled: true, address: nil)
expect(target_proc.call).to be_falsey
end
end
context 'with neither address nor enabled as truthy' do
it 'is falsey' do
stub_config(enabled: false, address: nil)
expect(target_proc.call).to be_falsey
end
end
end
before do before do
described_class.reset_config! described_class.reset_config!
allow(File).to receive(:exist?).and_return true allow(File).to receive(:exist?).and_return true
......
# frozen_string_literal: true # frozen_string_literal: true
require 'fast_spec_helper' require 'fast_spec_helper'
require 'support/shared_examples/malicious_regexp_shared_examples' require 'support/shared_examples/lib/gitlab/malicious_regexp_shared_examples'
require 'support/helpers/stub_feature_flags' require 'support/helpers/stub_feature_flags'
describe Gitlab::UntrustedRegexp::RubySyntax do describe Gitlab::UntrustedRegexp::RubySyntax do
......
# frozen_string_literal: true # frozen_string_literal: true
require 'fast_spec_helper' require 'fast_spec_helper'
require 'support/shared_examples/malicious_regexp_shared_examples' require 'support/shared_examples/lib/gitlab/malicious_regexp_shared_examples'
describe Gitlab::UntrustedRegexp do describe Gitlab::UntrustedRegexp do
describe '#initialize' do describe '#initialize' do
......
...@@ -27,7 +27,42 @@ describe Ci::BuildTraceChunk, :clean_gitlab_redis_shared_state do ...@@ -27,7 +27,42 @@ describe Ci::BuildTraceChunk, :clean_gitlab_redis_shared_state do
let(:build) { create(:ci_build, :running, :trace_live, pipeline: pipeline, project: parent) } let(:build) { create(:ci_build, :running, :trace_live, pipeline: pipeline, project: parent) }
let(:subjects) { build.trace_chunks } let(:subjects) { build.trace_chunks }
it_behaves_like 'fast destroyable' describe 'Forbid #destroy and #destroy_all' do
it 'does not delete database rows and associated external data' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { subjects.first.destroy }.to raise_error('`destroy` and `destroy_all` are forbidden. Please use `fast_destroy_all`')
expect { subjects.destroy_all }.to raise_error('`destroy` and `destroy_all` are forbidden. Please use `fast_destroy_all`') # rubocop: disable DestroyAll
expect(subjects.count).to be > 0
expect(external_data_counter).to be > 0
end
end
describe '.fast_destroy_all' do
it 'deletes database rows and associated external data' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { subjects.fast_destroy_all }.not_to raise_error
expect(subjects.count).to eq(0)
expect(external_data_counter).to eq(0)
end
end
describe '.use_fast_destroy' do
it 'performs cascading delete with fast_destroy_all' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { parent.destroy }.not_to raise_error
expect(subjects.count).to eq(0)
expect(external_data_counter).to eq(0)
end
end
def external_data_counter def external_data_counter
Gitlab::Redis::SharedState.with do |redis| Gitlab::Redis::SharedState.with do |redis|
......
...@@ -176,6 +176,35 @@ describe Projects::UpdatePagesService do ...@@ -176,6 +176,35 @@ describe Projects::UpdatePagesService do
describe 'maximum pages artifacts size' do describe 'maximum pages artifacts size' do
let(:metadata) { spy('metadata') } let(:metadata) { spy('metadata') }
shared_examples 'pages size limit is' do |size_limit|
context "when size is below the limit" do
before do
allow(metadata).to receive(:total_size).and_return(size_limit - 1.megabyte)
end
it 'updates pages correctly' do
subject.execute
expect(deploy_status.description).not_to be_present
expect(project.pages_metadatum).to be_deployed
end
end
context "when size is above the limit" do
before do
allow(metadata).to receive(:total_size).and_return(size_limit + 1.megabyte)
end
it 'limits the maximum size of gitlab pages' do
subject.execute
expect(deploy_status.description)
.to match(/artifacts for pages are too large/)
expect(deploy_status).to be_script_failure
end
end
end
before do before do
file = fixture_file_upload('spec/fixtures/pages.zip') file = fixture_file_upload('spec/fixtures/pages.zip')
metafile = fixture_file_upload('spec/fixtures/pages.zip.meta') metafile = fixture_file_upload('spec/fixtures/pages.zip.meta')
......
...@@ -4,7 +4,7 @@ ...@@ -4,7 +4,7 @@
# #
# Requires a reference: # Requires a reference:
# let(:reference) { '#42' } # let(:reference) { '#42' }
shared_examples 'a reference containing an element node' do RSpec.shared_examples 'a reference containing an element node' do
let(:inner_html) { 'element <code>node</code> inside' } let(:inner_html) { 'element <code>node</code> inside' }
let(:reference_with_element) { %{<a href="#{reference}">#{inner_html}</a>} } let(:reference_with_element) { %{<a href="#{reference}">#{inner_html}</a>} }
...@@ -18,7 +18,7 @@ end ...@@ -18,7 +18,7 @@ end
# subject { create(:user) } # subject { create(:user) }
# let(:reference) { subject.to_reference } # let(:reference) { subject.to_reference }
# let(:subject_name) { 'user' } # let(:subject_name) { 'user' }
shared_examples 'user reference or project reference' do RSpec.shared_examples 'user reference or project reference' do
shared_examples 'it contains a data- attribute' do shared_examples 'it contains a data- attribute' do
it 'includes a data- attribute' do it 'includes a data- attribute' do
doc = reference_filter("Hey #{reference}") doc = reference_filter("Hey #{reference}")
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'valid cluster create params' do RSpec.shared_context 'valid cluster create params' do
let(:params) do let(:params) do
{ {
name: 'test-cluster', name: 'test-cluster',
...@@ -16,7 +16,7 @@ shared_context 'valid cluster create params' do ...@@ -16,7 +16,7 @@ shared_context 'valid cluster create params' do
end end
end end
shared_context 'invalid cluster create params' do RSpec.shared_context 'invalid cluster create params' do
let(:params) do let(:params) do
{ {
name: 'test-cluster', name: 'test-cluster',
...@@ -31,7 +31,7 @@ shared_context 'invalid cluster create params' do ...@@ -31,7 +31,7 @@ shared_context 'invalid cluster create params' do
end end
end end
shared_examples 'create cluster service success' do RSpec.shared_examples 'create cluster service success' do
it 'creates a cluster object and performs a worker' do it 'creates a cluster object and performs a worker' do
expect(ClusterProvisionWorker).to receive(:perform_async) expect(ClusterProvisionWorker).to receive(:perform_async)
...@@ -53,7 +53,7 @@ shared_examples 'create cluster service success' do ...@@ -53,7 +53,7 @@ shared_examples 'create cluster service success' do
end end
end end
shared_examples 'create cluster service error' do RSpec.shared_examples 'create cluster service error' do
it 'returns an error' do it 'returns an error' do
expect(ClusterProvisionWorker).not_to receive(:perform_async) expect(ClusterProvisionWorker).not_to receive(:perform_async)
expect { subject }.to change { Clusters::Cluster.count }.by(0) expect { subject }.to change { Clusters::Cluster.count }.by(0)
......
...@@ -3,7 +3,7 @@ ...@@ -3,7 +3,7 @@
# Specifications for behavior common to all objects with executable attributes. # Specifications for behavior common to all objects with executable attributes.
# It can take a `default_params`. # It can take a `default_params`.
shared_examples 'new issuable record that supports quick actions' do RSpec.shared_examples 'new issuable record that supports quick actions' do
let!(:project) { create(:project, :repository) } let!(:project) { create(:project, :repository) }
let(:user) { create(:user).tap { |u| project.add_maintainer(u) } } let(:user) { create(:user).tap { |u| project.add_maintainer(u) } }
let(:assignee) { create(:user) } let(:assignee) { create(:user) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issuable update service' do RSpec.shared_examples 'issuable update service' do
def update_issuable(opts) def update_issuable(opts)
described_class.new(project, user, opts).execute(open_issuable) described_class.new(project, user, opts).execute(open_issuable)
end end
......
# frozen_string_literal: true # frozen_string_literal: true
require "spec_helper" RSpec.shared_examples "migrating a deleted user's associated records to the ghost user" do |record_class, fields|
shared_examples "migrating a deleted user's associated records to the ghost user" do |record_class, fields|
record_class_name = record_class.to_s.titleize.downcase record_class_name = record_class.to_s.titleize.downcase
let(:project) do let(:project) do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'change access checks context' do RSpec.shared_context 'change access checks context' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:project) { create(:project, :repository) } let(:project) { create(:project, :repository) }
let(:user_access) { Gitlab::UserAccess.new(user, project: project) } let(:user_access) { Gitlab::UserAccess.new(user, project: project) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'a GitHub-ish import controller' do RSpec.shared_context 'a GitHub-ish import controller' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:token) { "asdasd12345" } let(:token) { "asdasd12345" }
let(:access_params) { { github_access_token: token } } let(:access_params) { { github_access_token: token } }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_context 'Ldap::OmniauthCallbacksController' do
shared_context 'Ldap::OmniauthCallbacksController' do
include LoginHelpers include LoginHelpers
include LdapHelpers include LdapHelpers
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context :email_shared_context do RSpec.shared_context :email_shared_context do
let(:mail_key) { "59d8df8370b7e95c5a49fbf86aeb2c93" } let(:mail_key) { "59d8df8370b7e95c5a49fbf86aeb2c93" }
let(:receiver) { Gitlab::Email::Receiver.new(email_raw) } let(:receiver) { Gitlab::Email::Receiver.new(email_raw) }
let(:markdown) { "![image](uploads/image.png)" } let(:markdown) { "![image](uploads/image.png)" }
...@@ -18,7 +18,7 @@ shared_context :email_shared_context do ...@@ -18,7 +18,7 @@ shared_context :email_shared_context do
end end
end end
shared_examples :reply_processing_shared_examples do RSpec.shared_examples :reply_processing_shared_examples do
context "when the user could not be found" do context "when the user could not be found" do
before do before do
user.destroy user.destroy
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
RSpec.shared_context 'GroupProjectsFinder context' do RSpec.shared_context 'GroupProjectsFinder context' do
let(:group) { create(:group) } let(:group) { create(:group) }
let(:subgroup) { create(:group, parent: group) } let(:subgroup) { create(:group, parent: group) }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
RSpec.shared_context 'IssuesFinder context' do RSpec.shared_context 'IssuesFinder context' do
set(:user) { create(:user) } set(:user) { create(:user) }
set(:user2) { create(:user) } set(:user2) { create(:user) }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
RSpec.shared_context 'MergeRequestsFinder multiple projects with merge requests context' do RSpec.shared_context 'MergeRequestsFinder multiple projects with merge requests context' do
include ProjectForksHelper include ProjectForksHelper
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
RSpec.shared_context 'UsersFinder#execute filter by project context' do RSpec.shared_context 'UsersFinder#execute filter by project context' do
set(:normal_user) { create(:user, username: 'johndoe') } set(:normal_user) { create(:user, username: 'johndoe') }
set(:blocked_user) { create(:user, :blocked, username: 'notsorandom') } set(:blocked_user) { create(:user, :blocked, username: 'notsorandom') }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'JSON response' do RSpec.shared_context 'JSON response' do
let(:json_response) { JSON.parse(response.body) } let(:json_response) { JSON.parse(response.body) }
end end
# frozen_string_literal: true
RSpec.shared_context 'gitlab email notification' do
set(:group) { create(:group) }
set(:subgroup) { create(:group, parent: group) }
set(:project) { create(:project, :repository, name: 'a-known-name', group: group) }
set(:recipient) { create(:user, email: 'recipient@example.com') }
let(:gitlab_sender_display_name) { Gitlab.config.gitlab.email_display_name }
let(:gitlab_sender) { Gitlab.config.gitlab.email_from }
let(:gitlab_sender_reply_to) { Gitlab.config.gitlab.email_reply_to }
let(:new_user_address) { 'newguy@example.com' }
before do
email = recipient.emails.create(email: "notifications@example.com")
recipient.update_attribute(:notification_email, email.email)
stub_incoming_email_setting(enabled: true, address: "reply+%{key}@#{Gitlab.config.gitlab.host}")
end
end
RSpec.shared_context 'reply-by-email is enabled with incoming address without %{key}' do
before do
stub_incoming_email_setting(enabled: true, address: "reply@#{Gitlab.config.gitlab.host}")
end
end
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'merge request create context' do RSpec.shared_context 'merge request create context' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:user2) { create(:user) } let(:user2) { create(:user) }
let(:target_project) { create(:project, :public, :repository) } let(:target_project) { create(:project, :public, :repository) }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_context 'merge request edit context' do
shared_context 'merge request edit context' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:user2) { create(:user) } let(:user2) { create(:user) }
let!(:milestone) { create(:milestone, project: target_project) } let!(:milestone) { create(:milestone, project: target_project) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'merge request allowing collaboration' do RSpec.shared_context 'merge request allowing collaboration' do
include ProjectForksHelper include ProjectForksHelper
let(:canonical) { create(:project, :public, :repository) } let(:canonical) { create(:project, :public, :repository) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'rack attack cache store' do RSpec.shared_context 'rack attack cache store' do
around do |example| around do |example|
# Instead of test environment's :null_store so the throttles can increment # Instead of test environment's :null_store so the throttles can increment
Rack::Attack.cache.store = ActiveSupport::Cache::MemoryStore.new Rack::Attack.cache.store = ActiveSupport::Cache::MemoryStore.new
......
# frozen_string_literal: true # frozen_string_literal: true
Service.available_services_names.each do |service| Service.available_services_names.each do |service|
shared_context service do RSpec.shared_context service do
let(:dashed_service) { service.dasherize } let(:dashed_service) { service.dasherize }
let(:service_method) { "#{service}_service".to_sym } let(:service_method) { "#{service}_service".to_sym }
let(:service_klass) { "#{service}_service".classify.constantize } let(:service_klass) { "#{service}_service".classify.constantize }
......
...@@ -4,7 +4,7 @@ ...@@ -4,7 +4,7 @@
# let(:session) variable # let(:session) variable
# we do not use a parameter such as |session| because it does not play nice # we do not use a parameter such as |session| because it does not play nice
# with let variables # with let variables
shared_context 'custom session' do RSpec.shared_context 'custom session' do
let!(:session) { {} } let!(:session) { {} }
around do |example| around do |example|
......
# frozen_string_literal: true
RSpec.shared_context 'unique ips sign in limit' do
include StubENV
let(:request_context) { Gitlab::RequestContext.instance }
before do
Gitlab::Redis::Cache.with(&:flushall)
Gitlab::Redis::Queues.with(&:flushall)
Gitlab::Redis::SharedState.with(&:flushall)
end
before do
stub_env('IN_MEMORY_APPLICATION_SETTINGS', 'false')
Gitlab::CurrentSettings.update!(
unique_ips_limit_enabled: true,
unique_ips_limit_time_window: 10000
)
# Make sure we're working with the same request context everywhere
allow(Gitlab::RequestContext).to receive(:instance).and_return(request_context)
end
def change_ip(ip)
allow(request_context).to receive(:client_ip).and_return(ip)
end
def request_from_ip(ip)
change_ip(ip)
request
response
end
def operation_from_ip(ip)
change_ip(ip)
operation
end
end
...@@ -2,7 +2,7 @@ ...@@ -2,7 +2,7 @@
# Construct an `uploader` variable that is configured to `check_upload_type` # Construct an `uploader` variable that is configured to `check_upload_type`
# with `mime_types` and `extensions`. # with `mime_types` and `extensions`.
shared_context 'uploader with type check' do RSpec.shared_context 'uploader with type check' do
let(:uploader_class) do let(:uploader_class) do
Class.new(GitlabUploader) do Class.new(GitlabUploader) do
include UploadTypeCheck::Concern include UploadTypeCheck::Concern
...@@ -20,7 +20,7 @@ shared_context 'uploader with type check' do ...@@ -20,7 +20,7 @@ shared_context 'uploader with type check' do
end end
end end
shared_context 'stubbed MimeMagic mime type detection' do RSpec.shared_context 'stubbed MimeMagic mime type detection' do
let(:mime_type) { '' } let(:mime_type) { '' }
let(:magic_mime) { mime_type } let(:magic_mime) { mime_type }
let(:ext_mime) { mime_type } let(:ext_mime) { mime_type }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_context 'invalid urls' do RSpec.shared_context 'invalid urls' do
let(:urls_with_CRLF) do let(:urls_with_CRLF) do
["http://127.0.0.1:333/pa\rth", ["http://127.0.0.1:333/pa\rth",
"http://127.0.0.1:333/pa\nth", "http://127.0.0.1:333/pa\nth",
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples_for 'multiple issue boards' do RSpec.shared_examples 'multiple issue boards' do
context 'authorized user' do context 'authorized user' do
before do before do
parent.add_maintainer(user) parent.add_maintainer(user)
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'aborted merge requests for MWPS' do RSpec.shared_examples 'aborted merge requests for MWPS' do
let(:aborted_message) do let(:aborted_message) do
/aborted the automatic merge because target branch was updated/ /aborted the automatic merge because target branch was updated/
end end
...@@ -23,7 +23,7 @@ shared_examples 'aborted merge requests for MWPS' do ...@@ -23,7 +23,7 @@ shared_examples 'aborted merge requests for MWPS' do
end end
end end
shared_examples 'maintained merge requests for MWPS' do RSpec.shared_examples 'maintained merge requests for MWPS' do
it 'does not cancel auto merge' do it 'does not cancel auto merge' do
expect(merge_request.auto_merge_enabled?).to be_truthy expect(merge_request.auto_merge_enabled?).to be_truthy
expect(merge_request.notes).to be_empty expect(merge_request.notes).to be_empty
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples_for 'correct pipeline information for pipelines for merge requests' do RSpec.shared_examples 'correct pipeline information for pipelines for merge requests' do
context 'when pipeline for merge request' do context 'when pipeline for merge request' do
let(:pipeline) { merge_request.all_pipelines.first } let(:pipeline) { merge_request.all_pipelines.first }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'manual playable stage' do |stage_type| RSpec.shared_examples 'manual playable stage' do |stage_type|
let(:stage) { build(stage_type, status: status) } let(:stage) { build(stage_type, status: status) }
describe '#manual_playable?' do describe '#manual_playable?' do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'renders correct panels' do RSpec.shared_examples 'renders correct panels' do
it 'renders correct action on error' do it 'renders correct action on error' do
expect_next_instance_of(ApplicationSettings::UpdateService) do |service| expect_next_instance_of(ApplicationSettings::UpdateService) do |service|
allow(service).to receive(:execute).and_return(false) allow(service).to receive(:execute).and_return(false)
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_examples 'discussions provider' do
shared_examples 'discussions provider' do
it 'returns the expected discussions' do it 'returns the expected discussions' do
get :discussions, params: { namespace_id: project.namespace, project_id: project, id: requested_iid } get :discussions, params: { namespace_id: project.namespace, project_id: project, id: requested_iid }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples_for 'successful response for #cancel_auto_stop' do RSpec.shared_examples 'successful response for #cancel_auto_stop' do
include GitlabRoutingHelper include GitlabRoutingHelper
context 'when request is html' do context 'when request is html' do
...@@ -42,7 +42,7 @@ shared_examples_for 'successful response for #cancel_auto_stop' do ...@@ -42,7 +42,7 @@ shared_examples_for 'successful response for #cancel_auto_stop' do
end end
end end
shared_examples_for 'failed response for #cancel_auto_stop' do RSpec.shared_examples 'failed response for #cancel_auto_stop' do
context 'when request is html' do context 'when request is html' do
let(:params) { environment_params(format: :html) } let(:params) { environment_params(format: :html) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'sets the polling header' do RSpec.shared_examples 'sets the polling header' do
subject { response.headers[Gitlab::PollingInterval::HEADER_NAME] } subject { response.headers[Gitlab::PollingInterval::HEADER_NAME] }
it { is_expected.to eq '1000'} it { is_expected.to eq '1000'}
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_examples 'disabled when using an external authorization service' do
shared_examples 'disabled when using an external authorization service' do
include ExternalAuthorizationServiceHelpers include ExternalAuthorizationServiceHelpers
it 'works when the feature is not enabled' do it 'works when the feature is not enabled' do
...@@ -20,7 +18,7 @@ shared_examples 'disabled when using an external authorization service' do ...@@ -20,7 +18,7 @@ shared_examples 'disabled when using an external authorization service' do
end end
end end
shared_examples 'unauthorized when external service denies access' do RSpec.shared_examples 'unauthorized when external service denies access' do
include ExternalAuthorizationServiceHelpers include ExternalAuthorizationServiceHelpers
it 'allows access when the authorization service allows it' do it 'allows access when the authorization service allows it' do
......
...@@ -10,7 +10,7 @@ def assign_session_token(provider) ...@@ -10,7 +10,7 @@ def assign_session_token(provider)
session[:"#{provider}_access_token"] = 'asdasd12345' session[:"#{provider}_access_token"] = 'asdasd12345'
end end
shared_examples 'a GitHub-ish import controller: POST personal_access_token' do RSpec.shared_examples 'a GitHub-ish import controller: POST personal_access_token' do
let(:status_import_url) { public_send("status_import_#{provider}_url") } let(:status_import_url) { public_send("status_import_#{provider}_url") }
it "updates access token" do it "updates access token" do
...@@ -38,7 +38,7 @@ shared_examples 'a GitHub-ish import controller: POST personal_access_token' do ...@@ -38,7 +38,7 @@ shared_examples 'a GitHub-ish import controller: POST personal_access_token' do
end end
end end
shared_examples 'a GitHub-ish import controller: GET new' do RSpec.shared_examples 'a GitHub-ish import controller: GET new' do
let(:status_import_url) { public_send("status_import_#{provider}_url") } let(:status_import_url) { public_send("status_import_#{provider}_url") }
it "redirects to status if we already have a token" do it "redirects to status if we already have a token" do
...@@ -57,7 +57,7 @@ shared_examples 'a GitHub-ish import controller: GET new' do ...@@ -57,7 +57,7 @@ shared_examples 'a GitHub-ish import controller: GET new' do
end end
end end
shared_examples 'a GitHub-ish import controller: GET status' do RSpec.shared_examples 'a GitHub-ish import controller: GET status' do
let(:new_import_url) { public_send("new_import_#{provider}_url") } let(:new_import_url) { public_send("new_import_#{provider}_url") }
let(:user) { create(:user) } let(:user) { create(:user) }
let(:repo) { OpenStruct.new(login: 'vim', full_name: 'asd/vim', name: 'vim', owner: { login: 'owner' }) } let(:repo) { OpenStruct.new(login: 'vim', full_name: 'asd/vim', name: 'vim', owner: { login: 'owner' }) }
...@@ -76,7 +76,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do ...@@ -76,7 +76,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do
get :status, format: :json get :status, format: :json
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
expect(json_response.dig("imported_projects", 0, "id")).to eq(project.id) expect(json_response.dig("imported_projects", 0, "id")).to eq(project.id)
expect(json_response.dig("provider_repos", 0, "id")).to eq(repo.id) expect(json_response.dig("provider_repos", 0, "id")).to eq(repo.id)
expect(json_response.dig("provider_repos", 1, "id")).to eq(org_repo.id) expect(json_response.dig("provider_repos", 1, "id")).to eq(org_repo.id)
...@@ -107,7 +107,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do ...@@ -107,7 +107,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do
get :status get :status
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
end end
it "handles an invalid access token" do it "handles an invalid access token" do
...@@ -153,7 +153,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do ...@@ -153,7 +153,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do
it 'filters list of repositories by name' do it 'filters list of repositories by name' do
get :status, params: { filter: 'emacs' }, format: :json get :status, params: { filter: 'emacs' }, format: :json
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
expect(json_response.dig("imported_projects").count).to eq(0) expect(json_response.dig("imported_projects").count).to eq(0)
expect(json_response.dig("provider_repos").count).to eq(1) expect(json_response.dig("provider_repos").count).to eq(1)
expect(json_response.dig("provider_repos", 0, "id")).to eq(repo_2.id) expect(json_response.dig("provider_repos", 0, "id")).to eq(repo_2.id)
...@@ -173,7 +173,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do ...@@ -173,7 +173,7 @@ shared_examples 'a GitHub-ish import controller: GET status' do
end end
end end
shared_examples 'a GitHub-ish import controller: POST create' do RSpec.shared_examples 'a GitHub-ish import controller: POST create' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:provider_username) { user.username } let(:provider_username) { user.username }
let(:provider_user) { OpenStruct.new(login: provider_username) } let(:provider_user) { OpenStruct.new(login: provider_username) }
...@@ -198,7 +198,7 @@ shared_examples 'a GitHub-ish import controller: POST create' do ...@@ -198,7 +198,7 @@ shared_examples 'a GitHub-ish import controller: POST create' do
post :create, format: :json post :create, format: :json
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
end end
it 'returns 422 response with the base error when the project could not be imported' do it 'returns 422 response with the base error when the project could not be imported' do
...@@ -212,7 +212,7 @@ shared_examples 'a GitHub-ish import controller: POST create' do ...@@ -212,7 +212,7 @@ shared_examples 'a GitHub-ish import controller: POST create' do
post :create, format: :json post :create, format: :json
expect(response).to have_gitlab_http_status(422) expect(response).to have_gitlab_http_status(:unprocessable_entity)
expect(json_response['errors']).to eq('Name is invalid, Path is old') expect(json_response['errors']).to eq('Name is invalid, Path is old')
end end
...@@ -484,13 +484,13 @@ shared_examples 'a GitHub-ish import controller: POST create' do ...@@ -484,13 +484,13 @@ shared_examples 'a GitHub-ish import controller: POST create' do
post :create, params: { target_namespace: other_namespace.name }, format: :json post :create, params: { target_namespace: other_namespace.name }, format: :json
expect(response).to have_gitlab_http_status(422) expect(response).to have_gitlab_http_status(:unprocessable_entity)
end end
end end
end end
end end
shared_examples 'a GitHub-ish import controller: GET realtime_changes' do RSpec.shared_examples 'a GitHub-ish import controller: GET realtime_changes' do
let(:user) { create(:user) } let(:user) { create(:user) }
before do before do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'instance statistics availability' do RSpec.shared_examples 'instance statistics availability' do
let(:user) { create(:user) } let(:user) { create(:user) }
before do before do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issuable notes filter' do RSpec.shared_examples 'issuable notes filter' do
let(:params) do let(:params) do
if issuable_parent.is_a?(Project) if issuable_parent.is_a?(Project)
{ namespace_id: issuable_parent.namespace, project_id: issuable_parent, id: issuable.iid } { namespace_id: issuable_parent.namespace, project_id: issuable_parent, id: issuable.iid }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issuables list meta-data' do |issuable_type, action = nil| RSpec.shared_examples 'issuables list meta-data' do |issuable_type, action = nil|
include ProjectForksHelper include ProjectForksHelper
def get_action(action, project, extra_params = {}) def get_action(action, project, extra_params = {})
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issuables requiring filter' do |action| RSpec.shared_examples 'issuables requiring filter' do |action|
it "doesn't load any issuables if no filter is set" do it "doesn't load any issuables if no filter is set" do
expect_any_instance_of(described_class).not_to receive(:issuables_collection) expect_any_instance_of(described_class).not_to receive(:issuables_collection)
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'milestone tabs' do RSpec.shared_examples 'milestone tabs' do
def go(path, extra_params = {}) def go(path, extra_params = {})
params = params =
case milestone case milestone
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_examples 'paginated collection' do
shared_examples 'paginated collection' do
let(:collection) { nil } let(:collection) { nil }
let(:last_page) { collection.page.total_pages } let(:last_page) { collection.page.total_pages }
let(:action) { :index } let(:action) { :index }
......
...@@ -17,7 +17,7 @@ ...@@ -17,7 +17,7 @@
# it_behaves_like 'a controller that can serve LFS files', skip_lfs_disabled_tests: true do # it_behaves_like 'a controller that can serve LFS files', skip_lfs_disabled_tests: true do
# ... # ...
# end # end
shared_examples 'a controller that can serve LFS files' do |options = {}| RSpec.shared_examples 'a controller that can serve LFS files' do |options = {}|
let(:lfs_oid) { '91eff75a492a3ed0dfcb544d7f31326bc4014c8551849c192fd1e48d4dd2c897' } let(:lfs_oid) { '91eff75a492a3ed0dfcb544d7f31326bc4014c8551849c192fd1e48d4dd2c897' }
let(:lfs_size) { '1575078' } let(:lfs_size) { '1575078' }
let!(:lfs_object) { create(:lfs_object, oid: lfs_oid, size: lfs_size) } let!(:lfs_object) { create(:lfs_object, oid: lfs_oid, size: lfs_size) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'authenticates sessionless user' do |path, format, params| RSpec.shared_examples 'authenticates sessionless user' do |path, format, params|
params ||= {} params ||= {}
before do before do
...@@ -20,14 +20,14 @@ shared_examples 'authenticates sessionless user' do |path, format, params| ...@@ -20,14 +20,14 @@ shared_examples 'authenticates sessionless user' do |path, format, params|
get path, params: default_params.merge(private_token: personal_access_token.token) get path, params: default_params.merge(private_token: personal_access_token.token)
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
expect(controller.current_user).to eq(user) expect(controller.current_user).to eq(user)
end end
it 'does not log the user in if page is public', if: params[:public] do it 'does not log the user in if page is public', if: params[:public] do
get path, params: default_params get path, params: default_params
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
expect(controller.current_user).to be_nil expect(controller.current_user).to be_nil
end end
end end
...@@ -48,7 +48,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params| ...@@ -48,7 +48,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params|
get path, params: default_params.merge(private_token: personal_access_token.token) get path, params: default_params.merge(private_token: personal_access_token.token)
expect(response).not_to have_gitlab_http_status(200) expect(response).not_to have_gitlab_http_status(:ok)
end end
end end
...@@ -62,7 +62,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params| ...@@ -62,7 +62,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params|
@request.headers['PRIVATE-TOKEN'] = personal_access_token.token @request.headers['PRIVATE-TOKEN'] = personal_access_token.token
get path, params: default_params get path, params: default_params
expect(response).to have_gitlab_http_status(200) expect(response).to have_gitlab_http_status(:ok)
end end
end end
...@@ -75,7 +75,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params| ...@@ -75,7 +75,7 @@ shared_examples 'authenticates sessionless user' do |path, format, params|
get path, params: default_params.merge(feed_token: user.feed_token) get path, params: default_params.merge(feed_token: user.feed_token)
expect(response).to have_gitlab_http_status 200 expect(response).to have_gitlab_http_status(:ok)
end end
end end
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'set sort order from user preference' do RSpec.shared_examples 'set sort order from user preference' do
describe '#set_sort_order_from_user_preference' do describe '#set_sort_order_from_user_preference' do
# There is no sorting_field defined in any CE controllers yet, # There is no sorting_field defined in any CE controllers yet,
# however any other field present in user_preferences table can be used for testing. # however any other field present in user_preferences table can be used for testing.
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'todos actions' do RSpec.shared_examples 'todos actions' do
context 'when authorized' do context 'when authorized' do
before do before do
sign_in(user) sign_in(user)
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'a Trackable Controller' do RSpec.shared_examples 'a Trackable Controller' do
describe '#track_event' do describe '#track_event' do
before do before do
sign_in user sign_in user
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'update invalid issuable' do |klass| RSpec.shared_examples 'update invalid issuable' do |klass|
let(:params) do let(:params) do
{ {
namespace_id: project.namespace.path, namespace_id: project.namespace.path,
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'handle uploads' do RSpec.shared_examples 'handle uploads' do
let(:user) { create(:user) } let(:user) { create(:user) }
let(:jpg) { fixture_file_upload('spec/fixtures/rails_sample.jpg', 'image/jpg') } let(:jpg) { fixture_file_upload('spec/fixtures/rails_sample.jpg', 'image/jpg') }
let(:txt) { fixture_file_upload('spec/fixtures/doc_sample.txt', 'text/plain') } let(:txt) { fixture_file_upload('spec/fixtures/doc_sample.txt', 'text/plain') }
...@@ -287,7 +287,7 @@ shared_examples 'handle uploads' do ...@@ -287,7 +287,7 @@ shared_examples 'handle uploads' do
end end
end end
shared_examples 'handle uploads authorize' do RSpec.shared_examples 'handle uploads authorize' do
describe "POST #authorize" do describe "POST #authorize" do
context 'when a user is not authorized to upload a file' do context 'when a user is not authorized to upload a file' do
it 'returns 404 status' do it 'returns 404 status' do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'GET #show lists all variables' do RSpec.shared_examples 'GET #show lists all variables' do
it 'renders the variables as json' do it 'renders the variables as json' do
subject subject
...@@ -14,7 +14,7 @@ shared_examples 'GET #show lists all variables' do ...@@ -14,7 +14,7 @@ shared_examples 'GET #show lists all variables' do
end end
end end
shared_examples 'PATCH #update updates variables' do RSpec.shared_examples 'PATCH #update updates variables' do
let(:variable_attributes) do let(:variable_attributes) do
{ id: variable.id, { id: variable.id,
key: variable.key, key: variable.key,
......
# frozen_string_literal: true
shared_examples_for 'correctly finds the mail key' do
specify do
expect(Gitlab::Email::Handler).to receive(:for).with(an_instance_of(Mail::Message), 'gitlabhq/gitlabhq+auth_token').and_return(handler)
receiver.execute
end
end
# frozen_string_literal: true
shared_examples 'updated exposed field' do
it 'creates another Evidence object' do
model.send("#{updated_field}=", updated_value)
expect(model.evidence_summary_keys).to include(updated_field)
expect { model.save! }.to change(Evidence, :count).by(1)
expect(updated_json_field).to eq(updated_value)
end
end
shared_examples 'updated non-exposed field' do
it 'does not create any Evidence object' do
model.send("#{updated_field}=", updated_value)
expect(model.evidence_summary_keys).not_to include(updated_field)
expect { model.save! }.not_to change(Evidence, :count)
end
end
shared_examples 'updated field on non-linked entity' do
it 'does not create any Evidence object' do
model.send("#{updated_field}=", updated_value)
expect(model.evidence_summary_keys).to be_empty
expect { model.save! }.not_to change(Evidence, :count)
end
end
# frozen_string_literal: true
shared_examples_for 'fast destroyable' do
describe 'Forbid #destroy and #destroy_all' do
it 'does not delete database rows and associated external data' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { subjects.first.destroy }.to raise_error('`destroy` and `destroy_all` are forbidden. Please use `fast_destroy_all`')
expect { subjects.destroy_all }.to raise_error('`destroy` and `destroy_all` are forbidden. Please use `fast_destroy_all`') # rubocop: disable DestroyAll
expect(subjects.count).to be > 0
expect(external_data_counter).to be > 0
end
end
describe '.fast_destroy_all' do
it 'deletes database rows and associated external data' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { subjects.fast_destroy_all }.not_to raise_error
expect(subjects.count).to eq(0)
expect(external_data_counter).to eq(0)
end
end
describe '.use_fast_destroy' do
it 'performs cascading delete with fast_destroy_all' do
expect(external_data_counter).to be > 0
expect(subjects.count).to be > 0
expect { parent.destroy }.not_to raise_error
expect(subjects.count).to eq(0)
expect(external_data_counter).to eq(0)
end
end
end
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'archive download buttons' do RSpec.shared_examples 'archive download buttons' do
let(:path_to_visit) { project_path(project) } let(:path_to_visit) { project_path(project) }
let(:ref) { project.default_branch } let(:ref) { project.default_branch }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'comment on merge request file' do RSpec.shared_examples 'comment on merge request file' do
it 'adds a comment' do it 'adds a comment' do
click_diff_line(find("[id='#{sample_commit.line_code}']")) click_diff_line(find("[id='#{sample_commit.line_code}']"))
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'dirty submit form' do |selector_args| RSpec.shared_examples 'dirty submit form' do |selector_args|
selectors = selector_args.is_a?(Array) ? selector_args : [selector_args] selectors = selector_args.is_a?(Array) ? selector_args : [selector_args]
def expect_disabled_state(form, submit_selector, is_disabled = true) def expect_disabled_state(form, submit_selector, is_disabled = true)
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'thread comments' do |resource_name| RSpec.shared_examples 'thread comments' do |resource_name|
let(:form_selector) { '.js-main-target-form' } let(:form_selector) { '.js-main-target-form' }
let(:dropdown_selector) { "#{form_selector} .comment-type-dropdown" } let(:dropdown_selector) { "#{form_selector} .comment-type-dropdown" }
let(:toggle_selector) { "#{dropdown_selector} .dropdown-toggle" } let(:toggle_selector) { "#{dropdown_selector} .dropdown-toggle" }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issue sidebar stays collapsed on mobile' do RSpec.shared_examples 'issue sidebar stays collapsed on mobile' do
before do before do
resize_screen_xs resize_screen_xs
end end
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'issuable user dropdown behaviors' do RSpec.shared_examples 'issuable user dropdown behaviors' do
include FilteredSearchHelpers include FilteredSearchHelpers
before do before do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'multiple assignees merge request' do |action, save_button_title| RSpec.shared_examples 'multiple assignees merge request' do |action, save_button_title|
it "#{action} a MR with multiple assignees", :js do it "#{action} a MR with multiple assignees", :js do
find('.js-assignee-search').click find('.js-assignee-search').click
page.within '.dropdown-menu-user' do page.within '.dropdown-menu-user' do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'project features apply to issuables' do |klass| RSpec.shared_examples 'project features apply to issuables' do |klass|
let(:described_class) { klass } let(:described_class) { klass }
let(:group) { create(:group) } let(:group) { create(:group) }
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'shows public projects' do RSpec.shared_examples 'shows public projects' do
it 'shows projects' do it 'shows projects' do
expect(page).to have_content(public_project.title) expect(page).to have_content(public_project.title)
expect(page).not_to have_content(internal_project.title) expect(page).not_to have_content(internal_project.title)
...@@ -9,7 +9,7 @@ shared_examples 'shows public projects' do ...@@ -9,7 +9,7 @@ shared_examples 'shows public projects' do
end end
end end
shared_examples 'shows public and internal projects' do RSpec.shared_examples 'shows public and internal projects' do
it 'shows projects' do it 'shows projects' do
expect(page).to have_content(public_project.title) expect(page).to have_content(public_project.title)
expect(page).to have_content(internal_project.title) expect(page).to have_content(internal_project.title)
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples "protected branches > access control > CE" do RSpec.shared_examples "protected branches > access control > CE" do
ProtectedRefAccess::HUMAN_ACCESS_LEVELS.each do |(access_type_id, access_type_name)| ProtectedRefAccess::HUMAN_ACCESS_LEVELS.each do |(access_type_id, access_type_name)|
it "allows creating protected branches that #{access_type_name} can push to" do it "allows creating protected branches that #{access_type_name} can push to" do
visit project_protected_branches_path(project) visit project_protected_branches_path(project)
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_examples 'reportable note' do |type|
shared_examples 'reportable note' do |type|
include MobileHelpers include MobileHelpers
include NotesHelper include NotesHelper
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'creating an issue for a thread' do RSpec.shared_examples 'creating an issue for a thread' do
it 'shows an issue with the title filled in' do it 'shows an issue with the title filled in' do
title_field = page.find_field('issue[title]') title_field = page.find_field('issue[title]')
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples "an autodiscoverable RSS feed with current_user's feed token" do RSpec.shared_examples "an autodiscoverable RSS feed with current_user's feed token" do
it "has an RSS autodiscovery link tag with current_user's feed token" do it "has an RSS autodiscovery link tag with current_user's feed token" do
expect(page).to have_css("link[type*='atom+xml'][href*='feed_token=#{user.feed_token}']", visible: false) expect(page).to have_css("link[type*='atom+xml'][href*='feed_token=#{user.feed_token}']", visible: false)
end end
end end
shared_examples "it has an RSS button with current_user's feed token" do RSpec.shared_examples "it has an RSS button with current_user's feed token" do
it "shows the RSS button with current_user's feed token" do it "shows the RSS button with current_user's feed token" do
expect(page) expect(page)
.to have_css("a:has(.fa-rss)[href*='feed_token=#{user.feed_token}']") .to have_css("a:has(.fa-rss)[href*='feed_token=#{user.feed_token}']")
...@@ -14,13 +14,13 @@ shared_examples "it has an RSS button with current_user's feed token" do ...@@ -14,13 +14,13 @@ shared_examples "it has an RSS button with current_user's feed token" do
end end
end end
shared_examples "an autodiscoverable RSS feed without a feed token" do RSpec.shared_examples "an autodiscoverable RSS feed without a feed token" do
it "has an RSS autodiscovery link tag without a feed token" do it "has an RSS autodiscovery link tag without a feed token" do
expect(page).to have_css("link[type*='atom+xml']:not([href*='feed_token'])", visible: false) expect(page).to have_css("link[type*='atom+xml']:not([href*='feed_token'])", visible: false)
end end
end end
shared_examples "it has an RSS button without a feed token" do RSpec.shared_examples "it has an RSS button without a feed token" do
it "shows the RSS button without a feed token" do it "shows the RSS button without a feed token" do
expect(page) expect(page)
.to have_css("a:has(.fa-rss):not([href*='feed_token'])") .to have_css("a:has(.fa-rss):not([href*='feed_token'])")
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'top right search form' do RSpec.shared_examples 'top right search form' do
it 'does not show top right search form' do it 'does not show top right search form' do
expect(page).not_to have_selector('.search') expect(page).not_to have_selector('.search')
end end
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'showing user status' do RSpec.shared_examples 'showing user status' do
let!(:status) { create(:user_status, user: user_with_status, emoji: 'smirk', message: 'Authoring this object') } let!(:status) { create(:user_status, user: user_with_status, emoji: 'smirk', message: 'Authoring this object') }
it 'shows the status' do it 'shows the status' do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'variable list' do RSpec.shared_examples 'variable list' do
it 'shows list of variables' do it 'shows list of variables' do
page.within('.js-ci-variable-list-section') do page.within('.js-ci-variable-list-section') do
expect(first('.js-ci-variable-input-key').value).to eq(variable.key) expect(first('.js-ci-variable-input-key').value).to eq(variable.key)
......
...@@ -3,7 +3,7 @@ ...@@ -3,7 +3,7 @@
# Requires a context containing: # Requires a context containing:
# project # project
shared_examples 'wiki file attachments' do RSpec.shared_examples 'wiki file attachments' do
include DropzoneHelper include DropzoneHelper
context 'uploading attachments', :js do context 'uploading attachments', :js do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'assignee ID filter' do RSpec.shared_examples 'assignee ID filter' do
it 'returns issuables assigned to that user' do it 'returns issuables assigned to that user' do
expect(issuables).to contain_exactly(*expected_issuables) expect(issuables).to contain_exactly(*expected_issuables)
end end
end end
shared_examples 'assignee NOT ID filter' do RSpec.shared_examples 'assignee NOT ID filter' do
it 'returns issuables not assigned to that user' do it 'returns issuables not assigned to that user' do
expect(issuables).to contain_exactly(*expected_issuables) expect(issuables).to contain_exactly(*expected_issuables)
end end
end end
shared_examples 'assignee username filter' do RSpec.shared_examples 'assignee username filter' do
it 'returns issuables assigned to those users' do it 'returns issuables assigned to those users' do
expect(issuables).to contain_exactly(*expected_issuables) expect(issuables).to contain_exactly(*expected_issuables)
end end
end end
shared_examples 'assignee NOT username filter' do RSpec.shared_examples 'assignee NOT username filter' do
it 'returns issuables not assigned to those users' do it 'returns issuables not assigned to those users' do
expect(issuables).to contain_exactly(*expected_issuables) expect(issuables).to contain_exactly(*expected_issuables)
end end
end end
shared_examples 'no assignee filter' do RSpec.shared_examples 'no assignee filter' do
let(:params) { { assignee_id: 'None' } } let(:params) { { assignee_id: 'None' } }
it 'returns issuables not assigned to any assignee' do it 'returns issuables not assigned to any assignee' do
...@@ -38,7 +38,7 @@ shared_examples 'no assignee filter' do ...@@ -38,7 +38,7 @@ shared_examples 'no assignee filter' do
end end
end end
shared_examples 'any assignee filter' do RSpec.shared_examples 'any assignee filter' do
context '' do context '' do
let(:params) { { assignee_id: 'Any' } } let(:params) { { assignee_id: 'Any' } }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper' RSpec.shared_examples 'a finder with external authorization service' do
shared_examples 'a finder with external authorization service' do
include ExternalAuthorizationServiceHelpers include ExternalAuthorizationServiceHelpers
let(:user) { create(:user) } let(:user) { create(:user) }
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
# Shared example for legal queries that are expected to return nil. # Shared example for legal queries that are expected to return nil.
# Requires the following let bindings to be defined: # Requires the following let bindings to be defined:
# - post_query: action to send the query # - post_query: action to send the query
# - path: array of keys from query root to the result # - path: array of keys from query root to the result
shared_examples 'a failure to find anything' do RSpec.shared_examples 'a failure to find anything' do
it 'finds nothing' do it 'finds nothing' do
post_query post_query
......
# frozen_string_literal: true # frozen_string_literal: true
require 'spec_helper'
shared_context 'exposing regular notes on a noteable in GraphQL' do RSpec.shared_context 'exposing regular notes on a noteable in GraphQL' do
include GraphqlHelpers include GraphqlHelpers
let(:note) do let(:note) do
......
# frozen_string_literal: true # frozen_string_literal: true
shared_examples 'a request using Gitlab::UrlBlocker' do RSpec.shared_examples 'a request using Gitlab::UrlBlocker' do
# Written to test internal patches against 3rd party libraries # Written to test internal patches against 3rd party libraries
# #
# Expects the following to be available in the example contexts: # Expects the following to be available in the example contexts:
......
# frozen_string_literal: true
shared_examples 'redirecting a legacy path' do |source, target|
include RSpec::Rails::RequestExampleGroup
it "redirects #{source} to #{target} when the resource does not exist" do
expect(get(source)).to redirect_to(target)
end
it "does not redirect #{source} to #{target} when the resource exists" do
resource
expect(get(source)).not_to redirect_to(target)
end
end
shared_examples 'redirecting a legacy project path' do |source, target|
include RSpec::Rails::RequestExampleGroup
it "redirects #{source} to #{target}" do
expect(get(source)).to redirect_to(target)
end
end