Commit 74b5d4bc authored by GitLab Bot's avatar GitLab Bot

Automatic merge of gitlab-org/gitlab master

parents 9187f544 157d1a60
......@@ -39,7 +39,7 @@ class Projects::PipelinesController < Projects::ApplicationController
.new(project, current_user, index_params)
.execute
.page(params[:page])
.per(30)
.per(20)
@pipelines_count = limited_pipelines_count(project)
......
......@@ -7,13 +7,12 @@ type: reference
# .gitignore API
In GitLab, there is an API endpoint available for `.gitignore`. For more
information on `gitignore`, see the
[Git documentation](https://git-scm.com/docs/gitignore).
In GitLab, the `/gitignores` endpoint returns a list of Git `.gitignore` templates. For more information,
see the [Git documentation for `.gitignore`](https://git-scm.com/docs/gitignore).
## List `.gitignore` templates
## Get all `.gitignore` templates
Get all `.gitignore` templates.
Get a list of all `.gitignore` templates:
```plaintext
GET /templates/gitignores
......@@ -112,9 +111,9 @@ Example response:
]
```
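For reference, the list endpoint can be called with cURL. This is only a sketch: the host is a placeholder, and the token header should be replaced with a valid personal access token:
```shell
# List all .gitignore templates (placeholder host and token).
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.example.com/api/v4/templates/gitignores"
```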
## Single `.gitignore` template
## Get a single `.gitignore` template
Get a single `.gitignore` template.
Get a single `.gitignore` template:
```plaintext
GET /templates/gitignores/:key
......
......@@ -969,7 +969,7 @@ included in the generated anchor links. For example, when you link to
Keep in mind that the GitLab user interface links to many documentation pages
and anchor links to take the user to the right spot. When you change
a heading, search `doc/*`, `app/views/*`, and `ee/app/views/*` for the old
anchor. If you do not fix these links, the [`ui-docs-lint` job](../testing.md#ui-docs-links-test)
anchor. If you do not fix these links, the [`ui-docs-lint` job](../testing.md#ui-link-tests)
in your merge request fails.
Important:
......
......@@ -7,34 +7,40 @@ description: Learn how to contribute to GitLab Documentation.
# Documentation testing
We treat documentation as code, and so use tests in our CI pipeline to maintain the
standards and quality of the docs. The current tests, which run in CI jobs when a
merge request with new or changed docs is submitted, are:
- [`docs lint`](https://gitlab.com/gitlab-org/gitlab/-/blob/0b562014f7b71f98540e682c8d662275f0011f2f/.gitlab/ci/docs.gitlab-ci.yml#L41):
Runs several tests on the content of the docs themselves:
- [`lint-doc.sh` script](https://gitlab.com/gitlab-org/gitlab/blob/master/scripts/lint-doc.sh)
runs the following checks and linters:
- All cURL examples use the long flags (ex: `--header`, not `-H`).
- The `CHANGELOG.md` does not contain duplicate versions.
- No files in `doc/` are executable.
- No new `README.md` was added.
- [markdownlint](#markdownlint).
- [Vale](#vale).
- Nanoc tests:
- [`internal_links`](https://gitlab.com/gitlab-org/gitlab/-/blob/0b562014f7b71f98540e682c8d662275f0011f2f/.gitlab/ci/docs.gitlab-ci.yml#L58)
checks that all internal links (ex: `[link](../index.md)`) are valid.
- [`internal_anchors`](https://gitlab.com/gitlab-org/gitlab/-/blob/0b562014f7b71f98540e682c8d662275f0011f2f/.gitlab/ci/docs.gitlab-ci.yml#L60)
checks that all internal anchors (ex: `[link](../index.md#internal_anchor)`)
are valid.
- [`ui-docs-links lint`](https://gitlab.com/gitlab-org/gitlab/-/blob/0b562014f7b71f98540e682c8d662275f0011f2f/.gitlab/ci/docs.gitlab-ci.yml#L62)
checks that all links to docs from UI elements (`app/views` files, for example)
are linking to valid docs and anchors.
GitLab documentation is stored in projects with code and treated like code. Therefore, we use
processes similar to those used for code to maintain standards and quality of documentation.
We have tests:
- To lint the words and structure of the documentation.
- To check the validity of internal links within the documentation suite.
- To check the validity of links to documentation from UI elements, such as `app/views` files.
For the specifics of each test run in our CI/CD pipelines, see the configuration for those tests
in the relevant projects:
- <https://gitlab.com/gitlab-org/gitlab/-/blob/master/.gitlab/ci/docs.gitlab-ci.yml>
- <https://gitlab.com/gitlab-org/gitlab-runner/-/blob/master/.gitlab/ci/docs.gitlab-ci.yml>
- <https://gitlab.com/gitlab-org/omnibus-gitlab/-/blob/master/gitlab-ci-config/gitlab-com.yml>
- <https://gitlab.com/gitlab-org/charts/gitlab/-/blob/master/.gitlab-ci.yml>
## Run tests locally
Apart from [previewing your changes locally](index.md#previewing-the-changes-live), you can also run all lint checks
and Nanoc tests locally.
In addition to [previewing your changes locally](index.md#previewing-the-changes-live), you can also
run these tests on your local computer. This has the advantages of:
- Speeding up the feedback loop. You can learn about any problems with the changes in your branch
without waiting for a CI/CD pipeline to run.
- Lowering costs. Running tests locally is cheaper than running tests on GitLab's cloud
infrastructure.
To run tests locally, it's important to:
- [Install the tools](#install-linters), and [keep them up to date](#update-linters).
- Run [linters](#lint-checks), [documentation link tests](#documentation-link-tests), and
[UI link tests](#ui-link-tests) the same way they are run in CI/CD pipelines. It's important to use
the same configuration we use in CI/CD pipelines, which can differ from the tool's default
configuration. A sketch of such a local run follows this list.
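For example, a local run of the two main content linters might look like the following sketch. The configuration file paths are assumptions; use whatever is committed in the project you are linting:
```shell
# Lint the Markdown content under doc/ with the repository's markdownlint configuration (path assumed).
markdownlint --config .markdownlint.yml 'doc/**/*.md'
# Run Vale against the documentation, reporting only errors (configuration picked up from the repository's .vale.ini, if present).
vale --minAlertLevel error doc
```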
### Lint checks
......@@ -66,15 +72,15 @@ The output should be similar to:
This requires you to either:
- Have the required lint tools installed on your machine.
- Have the [required lint tools installed](#local-linters) on your computer.
- Have a working Docker installation, in which case an image with these tools pre-installed is used.
### Nanoc tests
### Documentation link tests
To execute Nanoc tests locally:
To execute documentation link tests locally:
1. Navigate to the [`gitlab-docs`](https://gitlab.com/gitlab-org/gitlab-docs) directory.
1. Run:
1. Run the following commands:
```shell
# Check for broken internal links
......@@ -85,7 +91,7 @@ To execute Nanoc tests locally:
bundle exec nanoc check internal_anchors
```
### `ui-docs-links` test
### UI link tests
The `ui-docs-links lint` job uses `haml-lint` to test that all links to docs from
UI elements (for example, `app/views` files) point to valid docs pages and anchors.
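To run just this check on your local computer, something like the following should work from the `gitlab` directory. Treat it as a sketch: the linter name `DocumentationLinks` and the target directories are assumptions based on the job description above:
```shell
# Run only the HAML linter that validates documentation links in view files (linter name assumed).
bundle exec haml-lint --include-linter DocumentationLinks app/views ee/app/views
```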
......@@ -191,22 +197,15 @@ You can use Vale:
At a minimum, install [markdownlint](#markdownlint) and [Vale](#vale) to match the checks run in
build pipelines:
1. Install `markdownlint-cli`, using either:
- `npm`:
1. Install `markdownlint-cli`:
```shell
npm install -g markdownlint-cli
```
- `yarn`:
```shell
yarn global add markdownlint-cli
```
```shell
yarn global add markdownlint-cli
```
We recommend installing the version of `markdownlint-cli` currently used in the documentation
linting [Docker image](https://gitlab.com/gitlab-org/gitlab-docs/-/blob/master/.gitlab-ci.yml#L420).
We recommend installing the version of `markdownlint-cli`
[used](https://gitlab.com/gitlab-org/gitlab-docs/-/blob/master/.gitlab-ci.yml#L447) when building
the `image:docs-lint-markdown`.
1. Install [`vale`](https://github.com/errata-ai/vale/releases). For example, to install using
`brew` for macOS, run:
......@@ -215,14 +214,29 @@ build pipelines:
brew install vale
```
We recommend installing the version of Vale currently used in the documentation linting
[Docker image](https://gitlab.com/gitlab-org/gitlab-docs/-/blob/master/.gitlab-ci.yml#L419).
These tools can be [integrated with your code editor](#configure-editors).
In addition to using markdownlint and Vale at the command line, these tools can be
[integrated with your code editor](#configure-editors).
### Update linters
It's important to use linter versions that are the same or newer than those run in
CI/CD. This provides access to new features and possible bug fixes.
To match the versions of `markdownlint-cli` and `vale` used in the GitLab projects, refer to the
[versions used](https://gitlab.com/gitlab-org/gitlab-docs/-/blob/master/.gitlab-ci.yml#L447)
when building the `image:docs-lint-markdown` Docker image containing these tools for CI/CD.
| Tool | Version | Command | Additional info |
|--------------------|----------|-------------------------------------------|-----------------|
| `markdownlint-cli` | Latest | `yarn global add markdownlint-cli` | n/a |
| `markdownlint-cli` | Specific | `yarn global add markdownlint-cli@0.23.2` | The `@` indicates a specific version, and this example updates the tool to version `0.23.2`. |
| Vale | Latest | `brew update && brew upgrade vale` | This command is for macOS only. |
| Vale | Specific | n/a | Not possible using `brew`, but can be [directly downloaded](https://github.com/errata-ai/vale/releases). |
### Configure editors
Using linters in your editor is more convenient than having to run the commands from the
command line.
To configure markdownlint within your editor, install one of the following as appropriate:
- [Sublime Text](https://packagecontrol.io/packages/SublimeLinter-contrib-markdownlint)
......
......@@ -71,6 +71,10 @@ To learn how to add an issue to an iteration, see the steps in
You can track the progress of an iteration by reviewing iteration reports.
An iteration report displays a list of all the issues assigned to an iteration and their status.
The report also shows a breakdown of total issues in an iteration.
Open iteration reports show a summary of completed, unstarted, and in-progress issues.
Closed iteration reports show the total number of issues completed by the due date.
To view an iteration report, go to the iterations list page and click an iteration's title.
### Iteration burndown and burnup charts
......
......@@ -3,17 +3,19 @@
import {
GlAlert,
GlBadge,
GlLoadingIcon,
GlEmptyState,
GlIcon,
GlDropdown,
GlDropdownItem,
GlEmptyState,
GlIcon,
GlLoadingIcon,
} from '@gitlab/ui';
import BurnCharts from 'ee/burndown_chart/components/burn_charts.vue';
import { formatDate } from '~/lib/utils/datetime_utility';
import glFeatureFlagsMixin from '~/vue_shared/mixins/gl_feature_flags_mixin';
import { __ } from '~/locale';
import IterationReportSummary from './iteration_report_summary.vue';
import IterationReportSummaryCards from './iteration_report_summary_cards.vue';
import IterationReportSummaryClosed from './iteration_report_summary_closed.vue';
import IterationReportSummaryOpen from './iteration_report_summary_open.vue';
import IterationForm from './iteration_form.vue';
import IterationReportTabs from './iteration_report_tabs.vue';
import query from '../queries/iteration.query.graphql';
......@@ -35,13 +37,15 @@ export default {
BurnCharts,
GlAlert,
GlBadge,
GlLoadingIcon,
GlEmptyState,
GlIcon,
GlDropdown,
GlDropdownItem,
GlEmptyState,
GlLoadingIcon,
IterationForm,
IterationReportSummary,
IterationReportSummaryCards,
IterationReportSummaryClosed,
IterationReportSummaryOpen,
IterationReportTabs,
},
apollo: {
......@@ -113,8 +117,11 @@ export default {
canEditIteration() {
return this.canEdit && this.namespaceType === Namespace.Group;
},
hasIteration() {
return !this.$apollo.queries.iteration.loading && this.iteration?.title;
loading() {
return this.$apollo.queries.iteration.loading;
},
showEmptyState() {
return !this.loading && this.iteration && !this.iteration.title;
},
status() {
switch (this.iteration.state) {
......@@ -131,6 +138,11 @@ export default {
return { text: __('Open'), variant: 'success' };
}
},
summaryComponent() {
return this.iteration.state === 'closed'
? IterationReportSummaryClosed
: IterationReportSummaryOpen;
},
},
mounted() {
this.boundOnPopState = this.onPopState.bind(this);
......@@ -171,9 +183,9 @@ export default {
<gl-alert v-if="error" variant="danger" @dismiss="error = ''">
{{ error }}
</gl-alert>
<gl-loading-icon v-if="$apollo.queries.iteration.loading" class="gl-py-5" size="lg" />
<gl-loading-icon v-else-if="loading" class="gl-py-5" size="lg" />
<gl-empty-state
v-else-if="!hasIteration"
v-else-if="showEmptyState"
:title="__('Could not find iteration')"
:compact="false"
/>
......@@ -214,11 +226,27 @@ export default {
</div>
<h3 ref="title" class="page-title">{{ iteration.title }}</h3>
<div ref="description" v-html="iteration.descriptionHtml"></div>
<iteration-report-summary
<component
:is="summaryComponent"
:full-path="fullPath"
:iteration-id="iteration.id"
:namespace-type="namespaceType"
>
<iteration-report-summary-cards
slot-scope="{ columns, loading: summaryLoading, total }"
:columns="columns"
:loading="summaryLoading"
:total="total"
/>
</component>
<burn-charts
:start-date="iteration.startDate"
:due-date="iteration.dueDate"
......
<script>
import { GlCard, GlSkeletonLoader, GlSprintf } from '@gitlab/ui';
export default {
cardBodyClass: 'gl-text-center gl-py-3 gl-font-size-h2',
cardClass: 'gl-bg-gray-10 gl-border-0 gl-mb-5',
components: {
GlCard,
GlSkeletonLoader,
GlSprintf,
},
props: {
columns: {
type: Array,
required: false,
default: () => [],
},
loading: {
type: Boolean,
required: true,
},
total: {
type: Number,
required: true,
},
},
methods: {
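// Express a column value as a whole-number percentage of the total; return 0 when the total is 0 to avoid NaN.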
percent(val) {
if (!this.total) return 0;
return ((val / this.total) * 100).toFixed(0);
},
},
};
</script>
<template>
<div class="row gl-mt-6">
<div v-for="(column, index) in columns" :key="index" class="col-sm-4">
<gl-card :class="$options.cardClass" :body-class="$options.cardBodyClass">
<gl-skeleton-loader v-if="loading" :width="400" :height="24">
<rect x="100" y="4" width="120" height="20" rx="4" />
<rect x="200" y="4" width="86" height="20" rx="4" />
</gl-skeleton-loader>
<div v-else>
<span class="gl-border-1 gl-border-r-solid gl-border-gray-100 gl-pr-3 gl-mr-2">
{{ column.title }}
<span class="gl-font-weight-bold"
>{{ percent(column.value) }}<small class="gl-text-gray-500">%</small></span
>
</span>
<gl-sprintf :message="__('%{count} of %{total}')">
<template #count>
<span class="gl-font-weight-bold">{{ column.value }}</span>
</template>
<template #total>
<span class="gl-font-weight-bold">{{ total }}</span>
</template>
</gl-sprintf>
</div>
</gl-card>
</div>
</div>
</template>
<script>
import { __ } from '~/locale';
import { fetchPolicies } from '~/lib/graphql';
import IterationReportSummaryCards from './iteration_report_summary_cards.vue';
import summaryStatsQuery from '../queries/iteration_issues_summary_stats.query.graphql';
export default {
components: {
IterationReportSummaryCards,
},
apollo: {
issues: {
fetchPolicy: fetchPolicies.NO_CACHE,
query: summaryStatsQuery,
variables() {
return this.queryVariables;
},
update(data) {
const stats = data.iteration?.report?.stats || {};
return {
complete: stats.complete?.count || 0,
incomplete: stats.incomplete?.count || 0,
total: stats.total?.count || 0,
};
},
},
},
props: {
iterationId: {
type: String,
required: true,
},
},
data() {
return {
issues: {
complete: 0,
incomplete: 0,
total: 0,
},
};
},
computed: {
queryVariables() {
return {
id: this.iterationId,
};
},
columns() {
return [
{
title: __('Completed'),
value: this.issues.complete,
},
{
title: __('Incomplete'),
value: this.issues.incomplete,
},
];
},
},
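// Renderless component: expose the fetched summary data to the parent through the default scoped slot.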
render() {
return this.$scopedSlots.default({
columns: this.columns,
loading: this.$apollo.queries.issues.loading,
total: this.issues.total,
});
},
};
</script>
<script>
import { GlCard, GlIcon } from '@gitlab/ui';
import { __ } from '~/locale';
import { getIdFromGraphQLId } from '~/graphql_shared/utils';
import query from '../queries/iteration_issues_summary.query.graphql';
import IterationReportSummaryCards from './iteration_report_summary_cards.vue';
import summaryStatsQuery from '../queries/iteration_issues_summary.query.graphql';
import { Namespace } from '../constants';
export default {
cardBodyClass: 'gl-text-center gl-py-3',
cardClass: 'gl-bg-gray-10 gl-border-0',
components: {
GlCard,
GlIcon,
IterationReportSummaryCards,
},
apollo: {
issues: {
query,
query: summaryStatsQuery,
variables() {
return this.queryVariables;
},
......@@ -48,7 +45,11 @@ export default {
},
data() {
return {
issues: {},
issues: {
assigned: 0,
open: 0,
closed: 0,
},
};
},
computed: {
......@@ -67,44 +68,38 @@ export default {
}
return ((closed / (open + closed)) * 100).toFixed(0);
},
showCards() {
return !this.$apollo.queries.issues.loading && Object.values(this.issues).every(a => a >= 0);
},
columns() {
return [
{
title: __('Complete'),
value: `${this.completedPercent}%`,
},
{
title: __('Open'),
value: this.issues.open,
icon: true,
title: __('Completed'),
value: this.issues.closed,
},
{
title: __('In progress'),
title: __('Incomplete'),
value: this.issues.assigned,
icon: true,
},
{
title: __('Completed'),
value: this.issues.closed,
icon: true,
title: __('Unstarted'),
value: this.issues.open,
},
];
},
total() {
return this.issues.open + this.issues.assigned + this.issues.closed;
},
},
methods: {
percent(val) {
if (!this.total) return 0;
return ((val / this.total) * 100).toFixed(0);
},
},
render() {
return this.$scopedSlots.default({
columns: this.columns,
loading: this.$apollo.queries.issues.loading,
total: this.total,
});
},
};
</script>
<template>
<div v-if="showCards" class="row gl-mt-6">
<div v-for="(column, index) in columns" :key="index" class="col-sm-3">
<gl-card :class="$options.cardClass" :body-class="$options.cardBodyClass" class="gl-mb-5">
<span>{{ column.title }}</span>
<span class="gl-font-size-h2 gl-font-weight-bold">{{ column.value }}</span>
<gl-icon v-if="column.icon" name="issues" :size="12" class="gl-text-gray-500" />
</gl-card>
</div>
</div>
</template>
......@@ -2,3 +2,11 @@ export const Namespace = {
Group: 'group',
Project: 'project',
};
export const iterationStates = {
closed: 'closed',
upcoming: 'upcoming',
expired: 'expired',
};
export default {};
......@@ -9,7 +9,12 @@ import IterationReport from './components/iteration_report.vue';
Vue.use(VueApollo);
const apolloProvider = new VueApollo({
defaultClient: createDefaultClient(),
defaultClient: createDefaultClient(
{},
{
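// Assumption: a batchMax of 1 keeps each Apollo query in its own HTTP request instead of batching it with others.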
batchMax: 1,
},
),
});
export function initIterationsList(namespaceType) {
......
......@@ -7,6 +7,7 @@ query IterationIssuesSummary($fullPath: ID!, $id: ID!, $isGroup: Boolean = true)
includeSubgroups: true
) {
count
weight
}
assignedIssues: issues(
iterationId: [$id]
......@@ -15,20 +16,25 @@ query IterationIssuesSummary($fullPath: ID!, $id: ID!, $isGroup: Boolean = true)
includeSubgroups: true
) {
count
weight
}
closedIssues: issues(iterationId: [$id], state: closed, includeSubgroups: true) {
count
weight
}
}
project(fullPath: $fullPath) @skip(if: $isGroup) {
openIssues: issues(iterationId: [$id], state: opened, assigneeId: "none") {
count
weight
}
assignedIssues: issues(iterationId: [$id], state: opened, assigneeId: "any") {
count
weight
}
closedIssues: issues(iterationId: [$id], state: closed) {
count
weight
}
}
}
query IterationIssuesSummaryStats($id: ID!) {
iteration(id: $id) {
report {
stats {
total {
weight
count
}
complete {
weight
count
}
incomplete {
weight
count
}
}
}
}
}
---
title: Fixed summary info for closed iterations
merge_request: 47879
author:
type: changed
......@@ -40,10 +40,9 @@ RSpec.describe 'User views iteration' do
end
aggregate_failures 'expect summary information' do
expect(page).to have_content("Complete 25%")
expect(page).to have_content("Open 2")
expect(page).to have_content("In progress 1")
expect(page).to have_content("Completed 1")
expect(page).to have_content("Completed")
expect(page).to have_content("Incomplete")
expect(page).to have_content("Unstarted")
end
aggregate_failures 'expect burnup and burndown charts' do
......
......@@ -35,10 +35,8 @@ RSpec.describe 'User views iteration' do
end
aggregate_failures 'shows correct summary information' do
expect(page).to have_content("Complete 50%")
expect(page).to have_content("Open 1")
expect(page).to have_content("In progress 0")
expect(page).to have_content("Completed 1")
expect(page).to have_content("Completed")
expect(page).to have_content("Incomplete")
end
aggregate_failures 'expect burnup and burndown charts' do
......
import { GlDropdown, GlDropdownItem, GlEmptyState, GlLoadingIcon, GlTab, GlTabs } from '@gitlab/ui';
import { shallowMount } from '@vue/test-utils';
import IterationForm from 'ee/iterations/components/iteration_form.vue';
import IterationReportSummaryOpen from 'ee/iterations/components/iteration_report_summary_open.vue';
import IterationReportSummaryClosed from 'ee/iterations/components/iteration_report_summary_closed.vue';
import IterationReport from 'ee/iterations/components/iteration_report.vue';
import IterationReportSummary from 'ee/iterations/components/iteration_report_summary.vue';
import IterationReportTabs from 'ee/iterations/components/iteration_report_tabs.vue';
import { Namespace } from 'ee/iterations/constants';
......@@ -72,6 +73,7 @@ describe('Iterations report', () => {
descriptionHtml: 'The first week of June',
startDate: '2020-06-02',
dueDate: '2020-06-08',
state: 'opened',
};
describe('user without edit permission', () => {
......@@ -104,6 +106,37 @@ describe('Iterations report', () => {
it('hides actions dropdown', () => {
expect(findActionsDropdown().exists()).toBe(false);
});
it('renders IterationReportSummaryOpen for open iteration', () => {
expect(wrapper.find(IterationReportSummaryOpen).props()).toEqual({
iterationId: iteration.id,
namespaceType: Namespace.Group,
fullPath: defaultProps.fullPath,
});
});
it('renders IterationReportSummaryClosed for closed iteration', async () => {
await wrapper.setData({
iteration: {
...iteration,
state: 'closed',
},
});
expect(wrapper.find(IterationReportSummaryClosed).props()).toEqual({
iterationId: iteration.id,
});
});
it('shows IterationReportTabs component', () => {
const iterationReportTabs = wrapper.find(IterationReportTabs);
expect(iterationReportTabs.props()).toEqual({
fullPath: defaultProps.fullPath,
iterationId: iteration.id,
namespaceType: Namespace.Group,
});
});
});
describe('user with edit permission', () => {
......@@ -135,22 +168,6 @@ describe('Iterations report', () => {
'/edit',
);
});
it('passes correct props to IterationReportSummary', () => {
const iterationReportSummary = wrapper.find(IterationReportSummary);
expect(iterationReportSummary.props('fullPath')).toBe(defaultProps.fullPath);
expect(iterationReportSummary.props('iterationId')).toBe(iteration.id);
expect(iterationReportSummary.props('namespaceType')).toBe(Namespace.Group);
});
it('passes correct props to IterationReportTabs', () => {
const iterationReportTabs = wrapper.find(IterationReportTabs);
expect(iterationReportTabs.props('fullPath')).toBe(defaultProps.fullPath);
expect(iterationReportTabs.props('iterationId')).toBe(iteration.id);
expect(iterationReportTabs.props('namespaceType')).toBe(Namespace.Group);
});
});
describe('loading edit form directly', () => {
......
import IterationReportSummaryCards from 'ee/iterations/components/iteration_report_summary_cards.vue';
import { mount } from '@vue/test-utils';
import { GlCard } from '@gitlab/ui';
describe('Iterations report summary cards', () => {
let wrapper;
const defaultProps = {
loading: false,
columns: [
{
title: 'Completed',
value: 10,
},
{
title: 'Incomplete',
value: 3,
},
{
title: 'Unstarted',
value: 2,
},
],
total: 15,
};
const mountComponent = (props = defaultProps) => {
wrapper = mount(IterationReportSummaryCards, {
propsData: props,
});
};
afterEach(() => {
wrapper.destroy();
wrapper = null;
});
const findCompleteCard = () =>
wrapper
.findAll(GlCard)
.at(0)
.text();
const findIncompleteCard = () =>
wrapper
.findAll(GlCard)
.at(1)
.text();
const findUnstartedCard = () =>
wrapper
.findAll(GlCard)
.at(2)
.text();
describe('with valid totals', () => {
beforeEach(() => {
mountComponent();
});
it('shows completed issues', () => {
const text = findCompleteCard();
expect(text).toContain('Completed');
expect(text).toContain('67%');
expect(text).toContain('10 of 15');
});
it('shows incomplete issues', () => {
const text = findIncompleteCard();
expect(text).toContain('Incomplete');
expect(text).toContain('20%');
expect(text).toContain('3 of 15');
});
it('shows unstarted issues', () => {
const text = findUnstartedCard();
expect(text).toContain('Unstarted');
expect(text).toContain('13%');
expect(text).toContain('2 of 15');
});
});
it('shows 0 (not NaN) when total is 0', () => {
mountComponent({
loading: false,
columns: [
{
title: 'Completed',
value: 0,
},
{
title: 'Incomplete',
value: 0,
},
{
title: 'Unstarted',
value: 0,
},
],
total: 0,
});
expect(findCompleteCard()).toContain('0 of 0');
expect(findIncompleteCard()).toContain('0 of 0');
expect(findUnstartedCard()).toContain('0 of 0');
});
});
import IterationReportSummaryClosed from 'ee/iterations/components/iteration_report_summary_closed.vue';
import { shallowMount } from '@vue/test-utils';
describe('Iterations report summary', () => {
let wrapper;
let slotSpy;
const id = 3;
const defaultProps = {
iterationId: `gid://gitlab/Iteration/${id}`,
};
const mountComponent = ({ props = defaultProps, loading = false, data = {} } = {}) => {
slotSpy = jest.fn();
wrapper = shallowMount(IterationReportSummaryClosed, {
propsData: props,
data() {
return data;
},
mocks: {
$apollo: {
queries: { issues: { loading } },
},
},
scopedSlots: {
default: slotSpy,
},
});
};
afterEach(() => {
wrapper.destroy();
wrapper = null;
});
describe('with valid totals', () => {
beforeEach(() => {
mountComponent({
data: {
issues: {
complete: 10,
incomplete: 3,
total: 13,
},
},
});
});
it('renders cards for each issue type', () => {
expect(slotSpy).toHaveBeenCalledWith({
loading: false,
columns: [
{
title: 'Completed',
value: 10,
},
{
title: 'Incomplete',
value: 3,
},
],
total: 13,
});
});
});
});
import IterationReportSummary from 'ee/iterations/components/iteration_report_summary_open.vue';
import IterationReportSummaryCards from 'ee/iterations/components/iteration_report_summary_cards.vue';
import { shallowMount } from '@vue/test-utils';
describe('Iterations report summary', () => {
let wrapper;
let slotSpy;
const id = 3;
const defaultProps = {
fullPath: 'gitlab-org',
iterationId: `gid://gitlab/Iteration/${id}`,
};
const mountComponent = ({ props = defaultProps, loading = false, data = {} } = {}) => {
slotSpy = jest.fn();
wrapper = shallowMount(IterationReportSummary, {
propsData: props,
components: {
IterationReportSummaryCards,
},
data() {
return data;
},
mocks: {
$apollo: {
queries: { issues: { loading } },
},
},
scopedSlots: {
default: slotSpy,
},
});
};
afterEach(() => {
wrapper.destroy();
wrapper = null;
});
describe('with valid totals', () => {
beforeEach(() => {
mountComponent({
data: {
issues: {
closed: 10,
assigned: 3,
open: 5,
},
},
});
});
it('passes data to cards component', () => {
expect(slotSpy).toHaveBeenCalledWith({
loading: false,
columns: [
{
title: 'Completed',
value: 10,
},
{
title: 'Incomplete',
value: 3,
},
{
title: 'Unstarted',
value: 5,
},
],
total: 18,
});
});
});
});
import { GlCard } from '@gitlab/ui';
import { mount } from '@vue/test-utils';
import IterationReportSummary from 'ee/iterations/components/iteration_report_summary.vue';
import { Namespace } from 'ee/iterations/constants';
describe('Iterations report summary', () => {
let wrapper;
const id = 3;
const fullPath = 'gitlab-org';
const defaultProps = {
fullPath,
iterationId: `gid://gitlab/Iteration/${id}`,
};
const mountComponent = ({ props = defaultProps, loading = false, data = {} } = {}) => {
wrapper = mount(IterationReportSummary, {
propsData: props,
data() {
return data;
},
mocks: {
$apollo: {
queries: { issues: { loading } },
},
},
});
};
afterEach(() => {
wrapper.destroy();
wrapper = null;
});
const findPercentageCard = () => wrapper.findAll(GlCard).at(0);
const findOpenCard = () => wrapper.findAll(GlCard).at(1);
const findInProgressCard = () => wrapper.findAll(GlCard).at(2);
const findCompletedCard = () => wrapper.findAll(GlCard).at(3);
describe('with valid totals', () => {
beforeEach(() => {
mountComponent();
wrapper.setData({
issues: {
open: 15,
assigned: 5,
closed: 10,
},
});
});
it('shows complete percentage', () => {
expect(findPercentageCard().text()).toContain('33%');
});
it('shows open issues', () => {
expect(findOpenCard().text()).toContain('Open');
expect(findOpenCard().text()).toContain('15');
});
it('shows in progress issues', () => {
expect(findInProgressCard().text()).toContain('In progress');
expect(findInProgressCard().text()).toContain('5');
});
it('shows completed issues', () => {
expect(findCompletedCard().text()).toContain('Completed');
expect(findCompletedCard().text()).toContain('10');
});
});
describe('with no issues', () => {
beforeEach(() => {
mountComponent();
wrapper.setData({
issues: {
open: 0,
assigned: 0,
closed: 0,
},
});
});
it('shows complete percentage', () => {
expect(findPercentageCard().text()).toContain('0%');
expect(findOpenCard().text()).toContain('0');
expect(findInProgressCard().text()).toContain('0');
expect(findCompletedCard().text()).toContain('0');
});
});
describe('IterationIssuesSummary query variables', () => {
const expected = {
fullPath: defaultProps.fullPath,
id,
};
describe('when group', () => {
it('has expected query variable values', () => {
mountComponent({
props: {
...defaultProps,
namespaceType: Namespace.Group,
},
});
expect(wrapper.vm.queryVariables).toEqual({
...expected,
isGroup: true,
});
});
});
describe('when project', () => {
it('has expected query variable values', () => {
mountComponent({
props: {
...defaultProps,
namespaceType: Namespace.Project,
},
});
expect(wrapper.vm.queryVariables).toEqual({
...expected,
isGroup: false,
});
});
});
});
});
......@@ -14542,6 +14542,9 @@ msgstr ""
msgid "Incompatible project"
msgstr ""
msgid "Incomplete"
msgstr ""
msgid "Indent"
msgstr ""
......@@ -29308,6 +29311,9 @@ msgstr ""
msgid "Unstar"
msgstr ""
msgid "Unstarted"
msgstr ""
msgid "Unsubscribe"
msgstr ""
......