......@@ -355,6 +355,9 @@ it highlighted:
}
```
CAUTION: **Deprecation:**
Beginning with GitLab 12.9, Dependency Scanning no longer reports `undefined` severity and confidence levels.
Here is the description of the report file structure nodes and their meaning. All fields are mandatory in
the report JSON unless stated otherwise. Presence of optional fields depends on the underlying analyzers being used.
......
......
......@@ -413,6 +413,9 @@ it highlighted:
}
```
CAUTION: **Deprecation:**
Beginning with GitLab 12.9, SAST no longer reports `undefined` severity and confidence levels.
Here is the description of the report file structure nodes and their meaning. All fields are mandatory in
the report JSON unless stated otherwise. Presence of optional fields depends on the underlying analyzers being used.
......
......
......@@ -279,21 +279,23 @@ This feature:
kubectl logs -n gitlab-managed-apps $(kubectl get pod -n gitlab-managed-apps -l app=nginx-ingress,component=controller --no-headers=true -o custom-columns=:metadata.name) modsecurity-log -f
```
To enable ModSecurity, check the **Enable Web Application Firewall** checkbox
when installing your [Ingress application](#ingress).
To enable the WAF, switch its toggle to the enabled position when installing or updating the [Ingress application](#ingress).
If this is your first time using GitLab's WAF, we recommend you follow the
[quick start guide](../../topics/web_application_firewall/quick_start_guide.md).
Enabling ModSecurity adds a small performance overhead. If this overhead is
significant for your application, you can disable ModSecurity's
rule engine for your deployed application by setting
[the deployment variable](../../topics/autodevops/index.md)
rule engine for your deployed application in any of the following ways:
1. Setting [the deployment variable](../../topics/autodevops/index.md)
`AUTO_DEVOPS_MODSECURITY_SEC_RULE_ENGINE` to `Off` (see the example after this list). This prevents ModSecurity
from processing any requests for the given application or environment.
To permanently disable it, you must [uninstall](#uninstalling-applications) and
reinstall your Ingress application for the changes to take effect.
1. Switching its toggle to the disabled position and applying the changes with the **Save changes** button. This reinstalls
Ingress with the updated configuration.
![Disabling WAF](../../topics/web_application_firewall/img/guide_waf_ingress_save_changes_v12_9.png)
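For the deployment variable option, a minimal sketch of a project-level `.gitlab-ci.yml` snippet is shown below. It assumes an Auto DevOps pipeline picks up the variable; the variable could equally be set as a CI/CD variable in the project settings.

```yaml
# Sketch only: disable the ModSecurity rule engine for deployed
# applications by setting the Auto DevOps deployment variable.
variables:
  AUTO_DEVOPS_MODSECURITY_SEC_RULE_ENGINE: "Off"
```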
### JupyterHub
......
......
......@@ -585,7 +585,7 @@ Service discovery:
- [`gitlab-cookbooks` / `gitlab_consul` · GitLab](https://gitlab.com/gitlab-cookbooks/gitlab_consul)
### Haproxy
### HAProxy
High Performance TCP/HTTP Load Balancer:
......
......
doc/user/project/integrations/img/prometheus_dashboard_edit_metric_link_v_12_9.png (new image, 28.5 KiB)
......@@ -172,6 +172,14 @@ There are 2 methods to specify a variable in a query or dashboard:
1. Variables can be specified using the [Liquid template format](https://help.shopify.com/en/themes/liquid/basics), for example `{{ci_environment_slug}}` ([added](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/20793) in GitLab 12.6). See the example after this list.
1. You can also enclose it in quotation marks with curly braces and a leading percent sign, for example `"%{ci_environment_slug}"`. However, this method is deprecated, and support will be [removed in the next major release](https://gitlab.com/gitlab-org/gitlab/issues/37990).
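As an example of the Liquid template format above, a minimal sketch of a project-defined dashboard file (for example, `.gitlab/dashboards/ingress.yml`) might interpolate the variable into a PromQL query as follows. The dashboard name, group, panel title, metric ID, and query below are illustrative placeholders, not part of the feature itself.

```yaml
dashboard: 'Ingress traffic'                  # illustrative dashboard name
panel_groups:
  - group: 'Ingress'
    panels:
      - title: 'Requests per second'
        type: 'area-chart'
        y_label: 'Requests / second'
        metrics:
          - id: ingress_requests_per_second   # placeholder metric ID
            # {{ci_environment_slug}} is replaced with the environment slug at query time
            query_range: 'sum(rate(nginx_ingress_controller_requests{environment="{{ci_environment_slug}}"}[2m]))'
            unit: 'req / sec'
            label: 'Total'
```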
#### Editing additional metrics from the dashboard
> [Introduced](https://gitlab.com/gitlab-org/gitlab/issues/208976) in GitLab 12.9.
You can edit existing additional custom metrics by clicking the **{ellipsis_v}** **More actions** dropdown and selecting **Edit metric**.
![Edit metric](img/prometheus_dashboard_edit_metric_link_v_12_9.png)
### Defining custom dashboards per project
> [Introduced](https://gitlab.com/gitlab-org/gitlab-foss/issues/59974) in GitLab 12.1.
......
......
......@@ -4,11 +4,12 @@ module API
module Helpers
module FileUploadHelpers
def file_is_valid?
params[:file] && params[:file]['tempfile'].respond_to?(:read)
filename = params[:file]&.original_filename
filename && ImportExportUploader::EXTENSION_WHITELIST.include?(File.extname(filename).delete('.'))
end
def validate_file!
render_api_error!('Uploaded file is invalid', 400) unless file_is_valid?
render_api_error!({ error: _('You need to upload a GitLab project export archive (ending in .gz).') }, 422) unless file_is_valid?
end
end
end
......
......
......@@ -21,10 +21,6 @@ module API
def rate_limiter
::Gitlab::ApplicationRateLimiter
end
def with_workhorse_upload_acceleration?
request.headers[Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER].present?
end
end
before do
......@@ -46,11 +42,7 @@ module API
params do
requires :path, type: String, desc: 'The new project path and name'
# TODO: remove rubocop disable - https://gitlab.com/gitlab-org/gitlab/issues/14960
# and mark WH fields as required (instead of optional) after the WH version including
# https://gitlab.com/gitlab-org/gitlab-workhorse/-/merge_requests/459
# is deployed and GITLAB_WORKHORSE_VERSION is updated accordingly.
requires :file, types: [::API::Validations::Types::WorkhorseFile, File], desc: 'The project export file to be imported' # rubocop:disable Scalability/FileUploads
requires :file, type: ::API::Validations::Types::WorkhorseFile, desc: 'The project export file to be imported'
optional :name, type: String, desc: 'The name of the project to be imported. Defaults to the path of the project if not provided.'
optional :namespace, type: String, desc: "The ID or name of the namespace that the project will be imported into. Defaults to the current user's namespace."
optional :overwrite, type: Boolean, default: false, desc: 'If there is a project in the same namespace and with the same name overwrite it'
......@@ -75,7 +67,7 @@ module API
success Entities::ProjectImportStatus
end
post 'import' do
require_gitlab_workhorse! if with_workhorse_upload_acceleration?
require_gitlab_workhorse!
key = "project_import".to_sym
......@@ -87,27 +79,19 @@ module API
Gitlab::QueryLimiting.whitelist('https://gitlab.com/gitlab-org/gitlab-foss/issues/42437')
validate_file!
namespace = if import_params[:namespace]
find_namespace!(import_params[:namespace])
else
current_user.namespace
end
# TODO: remove the condition after the WH version including
# https://gitlab.com/gitlab-org/gitlab-workhorse/-/merge_requests/459
# is deployed and GITLAB_WORKHORSE_VERSION is updated accordingly.
file = if with_workhorse_upload_acceleration?
import_params[:file] || bad_request!('Unable to process project import file')
else
validate_file!
import_params[:file]['tempfile']
end
project_params = {
path: import_params[:path],
namespace_id: namespace.id,
name: import_params[:name],
file: file,
file: import_params[:file],
overwrite: import_params[:overwrite]
}
......
......
......@@ -18284,9 +18284,6 @@ msgstr ""
msgid "Smartcard authentication failed: client certificate header is missing."
msgstr ""
 
msgid "Snippet Contents"
msgstr ""
msgid "Snippets"
msgstr ""
 
......
......
......@@ -140,6 +140,14 @@ describe SearchController do
end
end
context 'snippet search' do
it 'forces title search' do
get :show, params: { scope: 'snippet_blobs', snippets: 'true', search: 'foo' }
expect(assigns[:scope]).to eq('snippet_titles')
end
end
it 'finds issue comments' do
project = create(:project, :public)
note = create(:note_on_issue, project: project)
......
......
......@@ -32,6 +32,8 @@ shared_examples_for 'snippet editor' do
visit project_snippets_path(project)
# Wait for the SVG to ensure the button location doesn't shift
within('.empty-state') { find('img.js-lazy-loaded') }
click_on('New snippet')
wait_for_requests
end
......
......
......@@ -16,45 +16,4 @@ describe 'Search Snippets' do
expect(page).to have_link(public_snippet.title)
expect(page).to have_link(private_snippet.title)
end
it 'User searches for snippet contents' do
create(:personal_snippet,
:public,
title: 'Many lined snippet',
content: <<-CONTENT.strip_heredoc
|line one
|line two
|line three
|line four
|line five
|line six
|line seven
|line eight
|line nine
|line ten
|line eleven
|line twelve
|line thirteen
|line fourteen
CONTENT
)
sign_in create(:user)
visit dashboard_snippets_path
submit_search('line seven')
expect(page).to have_content('line seven')
# 3 lines before the matched line should be visible
expect(page).to have_content('line six')
expect(page).to have_content('line five')
expect(page).to have_content('line four')
expect(page).not_to have_content('line three')
# 3 lines after the matched line should be visible
expect(page).to have_content('line eight')
expect(page).to have_content('line nine')
expect(page).to have_content('line ten')
expect(page).not_to have_content('line eleven')
end
end
......@@ -5,13 +5,16 @@ require 'spec_helper'
describe BulkInsertSafe do
class BulkInsertItem < ApplicationRecord
include BulkInsertSafe
include ShaAttribute
validates :name, :enum_value, :secret_value, presence: true
validates :name, :enum_value, :secret_value, :sha_value, presence: true
ENUM_VALUES = {
case_1: 1
}.freeze
sha_attribute :sha_value
enum enum_value: ENUM_VALUES
attr_encrypted :secret_value,
......@@ -44,6 +47,7 @@ describe BulkInsertSafe do
t.integer :enum_value, null: false
t.text :encrypted_secret_value, null: false
t.string :encrypted_secret_value_iv, null: false
t.binary :sha_value, null: false, limit: 20
end
end
......@@ -61,7 +65,8 @@ describe BulkInsertSafe do
BulkInsertItem.new(
name: "item-#{n}",
enum_value: 'case_1',
secret_value: "my-secret"
secret_value: 'my-secret',
sha_value: '2fd4e1c67a2d28fced849ee1bb76e7391b93eb12'
)
end
end
......@@ -71,7 +76,8 @@ describe BulkInsertSafe do
BulkInsertItem.new(
name: nil, # requires `name` to be set
enum_value: 'case_1',
secret_value: "my-secret"
secret_value: 'my-secret',
sha_value: '2fd4e1c67a2d28fced849ee1bb76e7391b93eb12'
)
end
end
......@@ -112,6 +118,16 @@ describe BulkInsertSafe do
BulkInsertItem.bulk_insert!(items, batch_size: 5)
end
it 'items can be properly fetched from database' do
items = build_valid_items_for_bulk_insertion
BulkInsertItem.bulk_insert!(items)
attribute_names = BulkInsertItem.attribute_names - %w[id]
expect(BulkInsertItem.last(items.size).pluck(*attribute_names)).to eq(
items.pluck(*attribute_names))
end
it 'rolls back the transaction when any item is invalid' do
# second batch is bad
all_items = build_valid_items_for_bulk_insertion + build_invalid_items_for_bulk_insertion
......
......
......@@ -5,7 +5,6 @@ require 'spec_helper'
describe API::ProjectImport do
include WorkhorseHelpers
let(:export_path) { "#{Dir.tmpdir}/project_export_spec" }
let(:user) { create(:user) }
let(:file) { File.join('spec', 'features', 'projects', 'import_export', 'test_project_export.tar.gz') }
let(:namespace) { create(:group) }
......@@ -14,29 +13,39 @@ describe API::ProjectImport do
let(:workhorse_headers) { { 'GitLab-Workhorse' => '1.0', Gitlab::Workhorse::INTERNAL_API_REQUEST_HEADER => workhorse_token } }
before do
allow_any_instance_of(Gitlab::ImportExport).to receive(:storage_path).and_return(export_path)
stub_uploads_object_storage(FileUploader)
namespace.add_owner(user)
end
after do
FileUtils.rm_rf(export_path, secure: true)
describe 'POST /projects/import' do
subject { upload_archive(file_upload, workhorse_headers, params) }
let(:file_upload) { fixture_file_upload(file) }
let(:params) do
{
path: 'test-import',
'file.size' => file_upload.size
}
end
before do
allow(ImportExportUploader).to receive(:workhorse_upload_path).and_return('/')
end
describe 'POST /projects/import' do
it 'schedules an import using a namespace' do
stub_import(namespace)
params[:namespace] = namespace.id
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.id }
subject
expect(response).to have_gitlab_http_status(:created)
end
it 'schedules an import using the namespace path' do
stub_import(namespace)
params[:namespace] = namespace.full_path
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.full_path }
subject
expect(response).to have_gitlab_http_status(:created)
end
......@@ -46,24 +55,30 @@ describe API::ProjectImport do
it 'schedules an import using a namespace and a different name' do
stub_import(namespace)
params[:name] = expected_name
params[:namespace] = namespace.id
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.id, name: expected_name }
subject
expect(response).to have_gitlab_http_status(:created)
end
it 'schedules an import using the namespace path and a different name' do
stub_import(namespace)
params[:name] = expected_name
params[:namespace] = namespace.full_path
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.full_path, name: expected_name }
subject
expect(response).to have_gitlab_http_status(:created)
end
it 'sets name correctly' do
stub_import(namespace)
params[:name] = expected_name
params[:namespace] = namespace.full_path
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.full_path, name: expected_name }
subject
project = Project.find(json_response['id'])
expect(project.name).to eq(expected_name)
......@@ -71,8 +86,11 @@ describe API::ProjectImport do
it 'sets name correctly with an overwrite' do
stub_import(namespace)
params[:name] = 'new project name'
params[:namespace] = namespace.full_path
params[:overwrite] = true
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.full_path, name: 'new project name', overwrite: true }
subject
project = Project.find(json_response['id'])
expect(project.name).to eq('new project name')
......@@ -80,8 +98,10 @@ describe API::ProjectImport do
it 'schedules an import using the path and name explicitly set to nil' do
stub_import(namespace)
params[:name] = nil
params[:namespace] = namespace.full_path
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.full_path, name: nil }
subject
project = Project.find(json_response['id'])
expect(project.name).to eq('test-import')
......@@ -90,8 +110,9 @@ describe API::ProjectImport do
it 'schedules an import at the user namespace level' do
stub_import(user.namespace)
params[:path] = 'test-import2'
post api('/projects/import', user), params: { path: 'test-import2', file: fixture_file_upload(file) }
subject
expect(response).to have_gitlab_http_status(:created)
end
......@@ -100,7 +121,10 @@ describe API::ProjectImport do
expect_any_instance_of(ProjectImportState).not_to receive(:schedule)
expect(::Projects::CreateService).not_to receive(:new)
post api('/projects/import', user), params: { namespace: 'nonexistent', path: 'test-import2', file: fixture_file_upload(file) }
params[:namespace] = 'nonexistent'
params[:path] = 'test-import2'
subject
expect(response).to have_gitlab_http_status(:not_found)
expect(json_response['message']).to eq('404 Namespace Not Found')
......@@ -109,37 +133,40 @@ describe API::ProjectImport do
it 'does not schedule an import if the user has no permission to the namespace' do
expect_any_instance_of(ProjectImportState).not_to receive(:schedule)
post(api('/projects/import', create(:user)),
params: {
path: 'test-import3',
file: fixture_file_upload(file),
namespace: namespace.full_path
})
new_namespace = create(:group)
params[:path] = 'test-import3'
params[:namespace] = new_namespace.full_path
subject
expect(response).to have_gitlab_http_status(:not_found)
expect(json_response['message']).to eq('404 Namespace Not Found')
end
context 'if user uploads no valid file' do
let(:file) { 'README.md' }
it 'does not schedule an import if the user uploads no valid file' do
expect_any_instance_of(ProjectImportState).not_to receive(:schedule)
post api('/projects/import', user), params: { path: 'test-import3', file: './random/test' }
params[:path] = 'test-import3'
expect(response).to have_gitlab_http_status(:bad_request)
expect(json_response['error']).to eq('file is invalid')
subject
expect(response).to have_gitlab_http_status(:unprocessable_entity)
expect(json_response['message']['error']).to eq('You need to upload a GitLab project export archive (ending in .gz).')
end
end
it 'stores params that can be overridden' do
stub_import(namespace)
override_params = { 'description' => 'Hello world' }
post api('/projects/import', user),
params: {
path: 'test-import',
file: fixture_file_upload(file),
namespace: namespace.id,
override_params: override_params
}
params[:namespace] = namespace.id
params[:override_params] = override_params
subject
import_project = Project.find(json_response['id'])
expect(import_project.import_data.data['override_params']).to eq(override_params)
......@@ -149,33 +176,14 @@ describe API::ProjectImport do
stub_import(namespace)
override_params = { 'not_allowed' => 'Hello world' }
post api('/projects/import', user),
params: {
path: 'test-import',
file: fixture_file_upload(file),
namespace: namespace.id,
override_params: override_params
}
import_project = Project.find(json_response['id'])
params[:namespace] = namespace.id
params[:override_params] = override_params
expect(import_project.import_data.data['override_params']).to be_empty
end
it 'correctly overrides params during the import', :sidekiq_might_not_need_inline do
override_params = { 'description' => 'Hello world' }
subject
perform_enqueued_jobs do
post api('/projects/import', user),
params: {
path: 'test-import',
file: fixture_file_upload(file),
namespace: namespace.id,
override_params: override_params
}
end
import_project = Project.find(json_response['id'])
expect(import_project.description).to eq('Hello world')
expect(import_project.import_data.data['override_params']).to be_empty
end
context 'when target path already exists in namespace' do
......@@ -184,7 +192,9 @@ describe API::ProjectImport do
it 'does not schedule an import' do
expect_any_instance_of(ProjectImportState).not_to receive(:schedule)
post api('/projects/import', user), params: { path: existing_project.path, file: fixture_file_upload(file) }
params[:path] = existing_project.path
subject
expect(response).to have_gitlab_http_status(:bad_request)
expect(json_response['message']).to eq('Name has already been taken')
......@@ -194,7 +204,10 @@ describe API::ProjectImport do
it 'schedules an import' do
stub_import(user.namespace)
post api('/projects/import', user), params: { path: existing_project.path, file: fixture_file_upload(file), overwrite: true }
params[:path] = existing_project.path
params[:overwrite] = true
subject
expect(response).to have_gitlab_http_status(:created)
end
......@@ -207,16 +220,16 @@ describe API::ProjectImport do
end
it 'prevents users from importing projects' do
post api('/projects/import', user), params: { path: 'test-import', file: fixture_file_upload(file), namespace: namespace.id }
params[:namespace] = namespace.id
subject
expect(response).to have_gitlab_http_status(:too_many_requests)
expect(json_response['message']['error']).to eq('This endpoint has been requested too many times. Try again later.')
end
end
context 'with direct upload enabled' do
subject { upload_archive(file_upload, workhorse_headers, params) }
context 'when using remote storage' do
let(:file_name) { 'project_export.tar.gz' }
let!(:fog_connection) do
......@@ -232,21 +245,11 @@ describe API::ProjectImport do
let(:file_upload) { fog_to_uploaded_file(tmp_object) }
let(:params) do
{
path: 'test-import-project',
namespace: namespace.id,
'file.remote_id' => file_name,
'file.size' => file_upload.size
}
end
before do
allow(ImportExportUploader).to receive(:workhorse_upload_path).and_return('/')
end
it 'schedules an import' do
stub_import(namespace)
params[:namespace] = namespace.id
it 'accepts the request and stores the file' do
expect { subject }.to change { Project.count }.by(1)
subject
expect(response).to have_gitlab_http_status(:created)
end
......@@ -257,7 +260,7 @@ describe API::ProjectImport do
api("/projects/import", user),
method: :post,
file_key: :file,
params: params.merge(file: file_upload),
params: params.merge(file: file),
headers: headers,
send_rewritten_field: true
)
......@@ -301,6 +304,7 @@ describe API::ProjectImport do
expect(response).to have_gitlab_http_status(:ok)
expect(response.content_type.to_s).to eq(Gitlab::Workhorse::INTERNAL_API_CONTENT_TYPE)
expect(json_response['TempPath']).to eq(ImportExportUploader.workhorse_local_upload_path)
end
it 'rejects requests that bypassed gitlab-workhorse' do
......
......