feat: Add User Management (#2636)

*  adjust tests

* 🛠 refactor user invites to be idempotent (#2791)

* 🔐 Encrypt SMTP pass for user management backend (#2793)

* 📦 Add crypto-js to /cli

* 📦 Update package-lock.json

*  Create type for SMTP config

*  Encrypt SMTP pass

*  Update format for `userManagement.emails.mode`

*  Update format for `binaryDataManager.mode`

*  Update format for `logs.level`

* 🔥 Remove logging

* 👕 Fix lint

* 👰  n8n 2826 um wedding FE<>BE (#2789)

* remove mocks

* update authorization func

* lock down default role

* 🐛 fix requiring authentication for OPTIONS requests

* 🐛 fix cors and cookie issues in dev

* update setup route

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

* update telemetry

* 🐛 preload role for users

* 🐛 remove auth for password reset routes

* 🐛 fix forgot-password flow

*  allow workflow tag disabling

* update telemetry init

* add reset

* clear error notifications on signin

* remove load settings from node view

* remove user id from user state

* inherit existing user props

* go back in history on button click

* use replace to force redirect

* update stories

*  add env check for tag create

* 🧪 Add `/users` tests for user management backend (#2790)

*  Refactor users namespace

*  Adjust fillout endpoint

*  Refactor initTestServer arg

* ✏️ Specify agent type

* ✏️ Specify role type

*  Tighten `/users/:id` check

*  Add initial tests

* 🚚 Reposition init server map

*  Set constants in `validatePassword()`

*  Tighten `/users/:id` check

*  Improve checks in `/users/:id`

*  Add tests for `/users/:id`

* 📦 Update package-lock.json

*  Simplify expectation

*  Reuse util for authless agent

* 🚚 Make role names consistent

* 📘 Tighten namespaces map type

* 🔥 Remove unneeded default arg

*  Add tests for `POST /users`

* 📘 Create test SMTP account type

* ✏️ Improve wording

* 🎨 Formatting

* 🔥 Remove temp fix

*  Replace helper with config call

*  Fix failing tests

* 🔥 Remove outdated test

* 🔥 Remove unused helper

*  Increase readability of domain fetcher

*  Refactor payload validation

* 🔥 Remove repetition

*  Restore logging

*  Initialize logger in tests

* 🔥 Remove redundancy from check

* 🚚 Move `globalOwnerRole` fetching to global scope

* 🔥 Remove unused imports

* 🚚 Move random utils to own module

* 🚚 Move test types to own module

* ✏️ Add dividers to utils

* ✏️ Reorder `initTestServer` param docstring

* ✏️ Add TODO comment

*  Dry up member creation

*  Tighten search criteria

* 🧪 Add expectation to `GET /users`

*  Create role fetcher utils

*  Create one more role fetch util

* 🔥 Remove unneeded DB query

* 🧪 Add expectation to `POST /users`

* 🧪 Add expectation to `DELETE /users/:id`

* 🧪 Add another expectation to `DELETE /users/:id`

* 🧪 Add expectations to `DELETE /users/:id`

* 🧪 Adjust expectations in `POST /users/:id`

* 🧪 Add expectations to `DELETE /users/:id`

* 👕 Fix build

*  Update method

* 📘 Fix `userToDelete` type

*  Refactor `createAgent()`

*  Make role fetching global

*  Optimize roles fetching

*  Centralize member creation

*  Refactor truncation helper

* 🧪 Add teardown to `DELETE /users/:id`

* 🧪 Add DB expectations to users tests

* 🔥 Remove pass validation due to hash

* ✏️ Improve pass validation error message

*  Improve owner pass validation

*  Create logger initialization helper

*  Optimize helpers

*  Restructure `getAllRoles` helper

* 🧪 Add password reset flow tests for user management backend (#2807)

*  Refactor users namespace

*  Adjust fillout endpoint

*  Refactor initTestServer arg

* ✏️ Specify agent type

* ✏️ Specify role type

*  Tighten `/users/:id` check

*  Add initial tests

* 🚚 Reposition init server map

*  Set constants in `validatePassword()`

*  Tighten `/users/:id` check

*  Improve checks in `/users/:id`

*  Add tests for `/users/:id`

* 📦 Update package-lock.json

*  Simplify expectation

*  Reuse util for authless agent

* 🚚 Make role names consistent

* 📘 Tighten namespaces map type

* 🔥 Remove unneeded default arg

*  Add tests for `POST /users`

* 📘 Create test SMTP account type

* ✏️ Improve wording

* 🎨 Formatting

* 🔥 Remove temp fix

*  Replace helper with config call

*  Fix failing tests

* 🔥 Remove outdated test

*  Add tests for password reset flow

* ✏️ Fix test wording

*  Set password reset namespace

* 🔥 Remove unused helper

*  Increase readability of domain fetcher

*  Refactor payload validation

* 🔥 Remove repetition

*  Restore logging

*  Initialize logger in tests

* 🔥 Remove redundancy from check

* 🚚 Move `globalOwnerRole` fetching to global scope

* 🔥 Remove unused imports

* 🚚 Move random utils to own module

* 🚚 Move test types to own module

* ✏️ Add dividers to utils

* ✏️ Reorder `initTestServer` param docstring

* ✏️ Add TODO comment

*  Dry up member creation

*  Tighten search criteria

* 🧪 Add expectation to `GET /users`

*  Create role fetcher utils

*  Create one more role fetch util

* 🔥 Remove unneeded DB query

* 🧪 Add expectation to `POST /users`

* 🧪 Add expectation to `DELETE /users/:id`

* 🧪 Add another expectation to `DELETE /users/:id`

* 🧪 Add expectations to `DELETE /users/:id`

* 🧪 Adjust expectations in `POST /users/:id`

* 🧪 Add expectations to `DELETE /users/:id`

* 📘 Add namespace name to type

* 🚚 Adjust imports

*  Optimize `globalOwnerRole` fetching

* 🧪 Add expectations

* 👕 Fix build

* 👕 Fix build

*  Update method

*  Update method

* 🧪 Fix `POST /change-password` test

* 📘 Fix `userToDelete` type

*  Refactor `createAgent()`

*  Make role fetching global

*  Optimize roles fetching

*  Centralize member creation

*  Refactor truncation helper

* 🧪 Add teardown to `DELETE /users/:id`

* 🧪 Add DB expectations to users tests

*  Refactor as in users namespace

* 🧪 Add expectation to `POST /change-password`

* 🔥 Remove pass validation due to hash

* ✏️ Improve pass validation error message

*  Improve owner pass validation

*  Create logger initialization helper

*  Optimize helpers

*  Restructure `getAllRoles` helper

*  Update `truncate` calls

* 🐛 return 200 for non-existing user

*  fix tests for forgot-password and user creation

* Update packages/editor-ui/src/components/MainSidebar.vue

Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>

* Update packages/editor-ui/src/components/Telemetry.vue

Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>

* Update packages/editor-ui/src/plugins/telemetry/index.ts

Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>

* Update packages/editor-ui/src/plugins/telemetry/index.ts

Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>

* Update packages/editor-ui/src/plugins/telemetry/index.ts

Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>

* 🚚 Fix imports

* reset password only if a password exists

* Fix validation at `PATCH /workflows/:id` (#2819)

* 🐛 Validate entity only if workflow

* 👕 Fix build

* 🔨 refactor response from user creation

* 🐛 um email invite fix (#2833)

* update users invite

* fix notifications stacking on top of each other

* remove unnecessary check

* fix type issues

* update structure

* fix types

* 🐘  database migrations UM + password reset expiration (#2710)

* Add table prefix and assign existing workflows and credentials to owner for sqlite

* Added user management migration to MySQL

* Fixed some missing table prefixes and removed unnecessary user id

* Created migration for postgres and applies minor fixes

* Fixed migration for sqlite by removing the unnecessary index and for mysql by removing unnecessary user data

* Added password reset token expiration

* Addressing comments made by Ben

* add missing tablePrefix

*  fix tests + add tests for expiring pw-reset-token

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

*  treat skipped personalizationSurvey as not answered

* 🐛 removing active workflows when deleting user, 🐛 fix reinvite, 🐛 fix resolve-signup-token, 🐘 remove workflow name uniqueness

*  Add DB state check tests (#2841)

* 🔥 Remove unneeded import

* 🔥 Remove unneeded vars

* ✏️ Improve naming

* 🧪 Add expectations to `POST /owner`

* 🧪 Add expectations to `PATCH /me`

* 🧪 Add expectation to `PATCH /me/password`

* ✏️ Clarify when owner is owner shell

* 🧪 Add more expectations

*  Restore package-lock to parent branch state

* Add logging to user management endpoints v2 (#2836)

*  Initialize logger in tests

*  Add logs to mailer

*  Add logs to middleware

*  Add logs to me endpoints

*  Add logs to owner endpoints

*  Add logs to pass flow endpoints

*  Add logs to users endpoints

* 📘 Improve typings

*  Merge two logs into one

*  Adjust log type

*  Add password reset email log

* ✏️ Reword log message

*  Adjust log meta object

*  Add total to log

* ✏️ Add detail to log message

* ✏️ Reword log message

* ✏️ Reword log message

* 🐛 Make total users to set up accurate

* ✏️ Reword `Logger.debug()` messages

* ✏️ Phrasing change for consistency

* 🐛 Fix ID overridden in range query

* 🔨 small refactoring

* 🔐 add auth to push-connection

* 🛠   Create credentials namespace and add tests (#2831)

* 🧪 Fix failing test

* 📘 Improve `createAgent` signature

* 🚚 Fix `LoggerProxy` import

*  Create credentials endpoints namespace

* 🧪 Set up initial tests

*  Add validation to model

*  Adjust validation

* 🧪 Add test

* 🚚 Sort creds endpoints

* ✏️ Plan out pending tests

* 🧪 Add deletion tests

* 🧪 Add patch tests

* 🧪 Add get cred tests

* 🚚 Hoist import

* ✏️ Make test descriptions consistent

* ✏️ Adjust description

* 🧪 Add missing test

* ✏️ Make get descriptions consistent

*  Undo line break

*  Refactor to simplify `saveCredential`

* 🧪 Add non-owned tests for owner

* ✏️ Improve naming

* ✏️ Add clarifying comments

* 🚚 Improve imports

*  Initialize config file

* 🔥 Remove unneeded import

* 🚚 Rename dir

*  Adjust deletion call

*  Adjust error code

* ✏️ Touch up comment

*  Optimize fetching with `@RelationId`

* 🧪 Add expectations

*  Simplify mock calls

* 📘 Set deep readonly to object constants

* 🔥 Remove unused param and encryption key

*  Add more `@RelationId` calls in models

*  Restore

* 🐛 no auth for .svg

* 🛠 move auth cookie name to constant; 🐛 fix auth for push-connection

*  Add auth middleware tests (#2853)

*  Simplify existing suite

* 🧪 Validate that auth cookie exists

* ✏️ Move comment

* 🔥 Remove unneeded imports

* ✏️ Add clarifying comments

* ✏️ Document auth endpoints

* 🧪 Add middleware tests

* ✏️ Fix typos

Co-authored-by: Ben Hesseldieck <1849459+BHesseldieck@users.noreply.github.com>

* 🔥 Remove test description wrappers (#2874)

* 🔥 Remove /owner test wrappers

* 🔥 Remove auth middleware test wrappers

* 🔥 Remove auth endpoints test wrappers

* 🔥 Remove overlooked middleware wrappers

* 🔥 Remove me namespace test wrappers

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

*  Runtime checks for credentials load and execute workflows (#2697)

* Runtime checks for credentials load and execute workflows

* Fixed from reviewers

* Changed runtime validation for credentials to be on start instead of on demand

* Refactored validations to use user id instead of whole User instance

* Removed user entity from workflow project because it is no longer needed

* General fixes and improvements to runtime checks

* Remove query builder and improve styling

* Fix lint issues

*  remove personalizationAnswers when fetching all users

*  fix failing get all users test

*  check authorization routes also for authentication

* 🐛 fix defaults in reset command

* 🛠 refactorings from walkthrough (#2856)

*  Make `getTemplate` async

*  Remove query builder from `getCredentials`

*  Add save manual executions log message

*  Restore and hide migrations logs

*  Centralize ignore paths check

* 👕 Fix build

* 🚚 Rename `hasOwner` to `isInstanceOwnerSetUp`

*  Add `isSetUp` flag to `User`

*  Add `isSetUp` to FE interface

*  Adjust `isSetUp` checks on FE

* 👕 Fix build

*  Adjust `isPendingUser()` check

* 🚚 Shorten helper name

*  Refactor as `isPending` per feedback

* ✏️ Update log message

*  Broaden check

* 🔥 Remove unneeded relation

*  Refactor query

* 🔥 Re-remove logs from migrations

* 🛠 set up credentials router (#2882)

*  Refactor creds endpoints into router

* 🧪 Refactor creds tests to use router

* 🚚 Rename arg for consistency

* 🚚 Move `credentials.api.ts` outside /public

* 🚚 Rename constant for consistency

* 📘 Simplify types

* 🔥 Remove unneeded arg

* 🚚 Rename router to controller

*  Shorten endpoint

*  Update `initTestServer()` arg

*  Mutate response body in GET /credentials

* 🏎 improve performance of type cast for FE

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

* 🐛 remove GET /login from auth

* 🔀 merge master + FE update (#2905)

*  Add Templates (#2720)

* Templates Bugs / Fixed Various Bugs / Multiply Api Request, Carousel Gradient, Core Nodes Filters ...

* Updated MainSidebar Paddings

* N8N-Templates Bugfixing - Remove Unnecessary Icon (Shape), Refactor infiniteScrollEnabled Prop + updated infiniteScroll functionality

* N8N-2853 Fixed Carousel Arrows Bug after Cleaning the SearchBar

* fix telemetry init

* fix search tracking issues

* N8N-2853 Created FilterTemplateNode Constant Array, Filter PlayButton and WebhookRespond from Nodes, Added Box for showing more nodes inside TemplateList, Updated NewWorkflowButton to primary, Fixed Markdown issue with Code

* N8N-2853 Removed Placeholder if Workflows Or Collections are not found, Updated the Logic

* fix telemetry events

* clean up session id

* update user inserted event

* N8N-2853 Fixed Categories moving when the names are long

* Add todos

* Update Routes on loading

* fix spacing

* Update Border Color

* Update Border Readius

* fix filter fn

* fix constant, console error

* N8N-2853 PR Fixes, Refactoring, Removing unnecessary code ..

* N8N-2853 PR Fixes - Editor-ui Fixes, Refactoring, Removing Dead Code ...

* N8N-2853 Refactor Card to LongCard

* clean up spacing, replace css var

* clean up spacing

* set categories as optional in node

* replace vars

* refactor store

* remove unnecessary import

* fix error

* fix templates view to start

* add to cache

* fix coll view data

* fix categories

* fix category event

* fix collections carousel

* fix initial load and search

* fix infinite load

* fix query param

* fix scrolling issues

* fix scroll to top

* fix search

* fix collections search

* fix navigation bug

* rename view

* update package lock

* rename workflow view

* rename coll view

* update routes

* add wrapper component

* set session id

* fix search tracking

* fix session tracking

* remove deleted mutation

* remove check for unsupported nodes

* refactor filters

* lazy load template

* clean up types

* refactor infinite scroll

* fix end of search

* Fix spacing

* fix coll loading

* fix types

* fix coll view list

* fix navigation

* rename types

* rename state

* fix search responsiveness

* fix coll view spacing

* fix search view spacing

* clean up views

* set background color

* center page not vert

* fix workflow view

* remove import

* fix background color

* fix background

* clean props

* clean up imports

* refactor button

* update background color

* fix spacing issue

* rename event

* update telemetry event

* update endpoints, add loading view, check for endpoint health

* remove console log

* N8N-2853 Fixed Menu Items Padding

* replace endpoints

* fix type issues

* fix categories

* N8N-2853 Fixed ParameterInput Placeholder after ElementUI Upgrade

* update createdAt

*  Fix placeholder in creds config modal

* ✏️ Adjust docstring to `credText` placeholder version

* N8N-2853 Optimized

* N8N-2853 Optimized code

*  Add deployment type to FE settings

*  Add deployment type to interfaces

* N8N-2853 Removed Animated prop from components

*  Add deployment type to store module

*  Create hiring banner

*  Display hiring banner

*  Undo unrelated change

* N8N-2853 Refactor TemplateFilters

*  Fix indentation

* N8N-2853 Reorder items / TemplateList

* 👕 Fix lint

* N8N-2853 Refactor TemplateFilters Component

* N8N-2853 Reorder TemplateList

* refactor template card

* update timeout

* fix removelistener

* fix spacing

* split enabled from offline

* add spacing to go back

* N8N-2853 Fixed Screens for Tablet & Mobile

* N8N-2853 Update Stores Order

* remove image component

* remove placeholder changes

* N8N-2853 Fixed Chinese Placeholders for El Select Component that comes from the Library Upgrade

* N8N-2853 Fixed Vue Agile Console Warnings

* N8N-2853 Update Collection Route

* ✏️ Update jobs URL

* 🚚 Move logging to root component

*  Refactor `deploymentType` to `isInternalUser`

*  Improve syntax

* fix cut bug in readonly view

* N8N-3012 Fixed Details section in templates with lots of description, Fixed Markdown Block with overflow-x

* N8N-3012 Increased Font-size, Spacing and Line-height of the Categories Items

* N8N-3012 Fixed Vue-agile client width error on resize

* only delay redirect for root path

* N8N-3012 Fixed Carousel Arrows that Disappear

* N8N-3012 Make Loading Screen same color as Templates

* N8N-3012 Markdown renders inline block as block code

* add offline warning

* hide log from workflow iframe

* update text

* make search button larger

* N8N-3012 Categories / Tags extended all the way in details section

* load data in cred modals

* remove deleted message

* add external hook

* remove import

* update env variable description

* fix markdown width issue

* disable telemetry for demo, add session id to template pages

* fix telemetery bugs

* N8N-3012 Not found Collections/Workflow

* N8N-3012 Checkboxes change order when categories are changed

* N8N-3012 Refactor SortedCategories inside TemplateFilters component

* fix firefox bug

* add telemetry requirements

* add error check

* N8N-3012 Update GoBackButton to check if Route History is present

* N8N-3012 Fixed WF Nodes Icons

* hide workflow screenshots

* remove unnecessary mixins

* rename prop

* fix design a bit

* rename data

* clear workspace on destroy

* fix copy paste bug

* fix disabled state

* N8N-3012 Fixed Saving/Leave without saving Modal

* fix telemetry issue

* fix telemetry issues, error bug

* fix error notification

* disable workflow menu items on templates

* fix i18n elementui issue

* Remove Emit - NodeType from HoverableNodeIcon component

* TechnicalFixes: NavigateTo passed down as function should be helper

* TechnicalFixes: Update NavigateTo function

* TechnicalFixes: Add FilterCoreNodes directly as function

* check for empty connections

* fix titles

* respect new lines

* increase categories to be sliced

* rename prop

* onUseWorkflow

* refactor click event

* fix bug, refactor

* fix loading story

* add default

* fix styles at right level of abstraction

* add wrapper with width

* remove loading blocks component

* add story

* rename prop

* fix spacing

* refactor tag, add story

* move margin to container

* fix tag redirect, remove unnecessary check

* make version optional

* rename view

* move from workflows to templates store

* remove unnecessary change

* remove unnecessary css

* rename component

* refactor collection card

* add boolean to prevent shrink

* clean up carousel

* fix redirection bug on save

* remove listeners to fix multiple listeners bug

* remove unnecessary types

* clean up boolean set

* fix node select bug

* rename component

* remove unnecessary class

* fix redirection bug

* remove unnecessary error

* fix typo

* fix blockquotes, pre

* refactor markdown rendering

* remove console log

* escape markdown

* fix safari bug

* load active workflows to fix modal bug

* ⬆️ Update package-lock.json file

*  Add n8n version as header

Co-authored-by: Mutasem Aldmour <4711238+mutdmour@users.noreply.github.com>
Co-authored-by: Mutasem <mutdmour@gmail.com>
Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>

* 🔖 Release n8n-workflow@0.88.0

* ⬆️ Set n8n-workflow@0.88.0 on n8n-core

* 🔖 Release n8n-core@0.106.0

* ⬆️ Set n8n-core@0.106.0 and n8n-workflow@0.88.0 on n8n-node-dev

* 🔖 Release n8n-node-dev@0.45.0

* ⬆️ Set n8n-core@0.106.0 and n8n-workflow@0.88.0 on n8n-nodes-base

* 🔖 Release n8n-nodes-base@0.163.0

* 🔖 Release n8n-design-system@0.12.0

* ⬆️ Set n8n-design-system@0.12.0 and n8n-workflow@0.88.0 on n8n-editor-ui

* 🔖 Release n8n-editor-ui@0.132.0

* ⬆️ Set n8n-core@0.106.0, n8n-editor-ui@0.132.0, n8n-nodes-base@0.163.0 and n8n-workflow@0.88.0 on n8n

* 🔖 Release n8n@0.165.0

* fix default user bug

* fix bug

* update package lock

* fix duplicate import

* fix settings

* fix templates access

Co-authored-by: Oliver Trajceski <olivertrajceski@yahoo.com>
Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>

*  n8n 2952 personalisation (#2911)

* refactor/update survey

* update customers

* Fix up personalization survey

* fix recommendation logic

* set to false

* hide suggested nodes when empty

* use keys

* add missing logic

* switch types

* Fix logic

* remove unused constants

* add back constant

* refactor filtering inputs

* hide last input on personal

* fix other

*  add current pw check for change password (#2912)

* fix back button

* Add current password input

* add to modal

* update package.json

* delete mock file

* delete mock file

* get settings func

* update router

* update package lock

* update package lock

* Fix invite text

* update error i18n

* open personalization on search if not set

* update error view i18n

* update change password

* update settings sidebar

* remove import

* fix sidebar

* 🥅 fix error for credential/workflow not found

* update invite modal

*  persist skipping owner setup (#2894)

* 🚧 added skipInstanceOwnerSetup to DB + route to save skipping

*  skipping owner setup persists

*  add tests for authorization and /owner/skip-setup

* 🛠 refactor FE settings getter

* 🛠 move setting setup stop to owner creation

* 🐛 fix wrong setting of User.isPending

* 🐛 fix isPending

* 🏷 add isPending to PublicUser

* 🐛 fix unused import

* update delete modal

* change password modal

* remove _label

* sort keys

* remove key

* update key names

* fix test endpoint

* 🥅 Handle error workflows permissions (#2908)

* Handle error workflows permissions

* Fixed wrong query format

* 🛠 refactor query

Co-authored-by: Ben Hesseldieck <1849459+BHesseldieck@users.noreply.github.com>

* fix ts issue

* fix list after ispending changes

* fix error page bugs

* fix error redirect

* fix notification

* 🐛 fix survey import in migration

* fix up spacing

* update keys spacing

* update keys

* add space

* update key

* fix up more spacing

* 🔐 add current password (#2919)

* add curr pass

* update key names

* 🐛 stringify tag ids

* 🔐 check current password before update

* add package lock

* fix dep version

* update version

* 🐛 fix access for instance owner to credentials (#2927)

* 🛠 stringify tag id on entity

* 🔐 Update password requirements (#2920)

*  Update password requirements

*  Adjust random helpers

*  fix tests for currentPassword check

* change redirection, add homepage

* fix error view redirection

* updated wording

* fix setup redirection

* update validator

* remove successfully

* update consumers

* update settings redirect

* on signup, redirect to homepage

* update empty state

* add space to emails

* remove brackets

* add opacity

* update spacing

* remove border from last user

* personal details updated

* update redirect on sign up

* prevent text wrap

* fix notification title line height

* remove console log

* 🐘 Support testing with Postgres and MySQL (#2886)

* 🗃️ Fix Postgres migrations

*  Add DB-specific scripts

*  Set up test connections

*  Add Postgres UUID check

* 🧪 Make test adjustments for Postgres

*  Refactor connection logic

*  Set up double init for Postgres

* ✏️ Add TODOs

*  Refactor DB dropping logic

*  Implement global teardown

*  Create TypeORM wrappers

*  Initial MySQL setup

*  Clean up Postgres connection options

*  Simplify by sharing bootstrap connection name

* 🗃️ Fix MySQL migrations

* 🔥 Remove comments

*  Use ES6 imports

* 🔥 Remove outdated comments

*  Centralize bootstrap connection name handles

*  Centralize database types

* ✏️ Update comment

* 🚚 Rename `findRepository`

* 🚧 Attempt to truncate MySQL

*  Implement creds router

* 🐛 Fix duplicated MySQL bootstrap

* 🐛 Fix misresolved merge conflict

* 🗃️ Fix tags migration

* 🗃️ Fix MySQL UM migration

* 🐛 Fix MySQL parallelization issues

* 📘 Augment TypeORM to prevent error

* 🔥 Remove comments

*  Support one sqlite DB per suite run

* 🚚 Move `testDb` to own module

* 🔥 Deduplicate bootstrap Postgres logic

* 🔥 Remove unneeded comment

*  Make logger init calls consistent

* ✏️ Improve comment

* ✏️ Add dividers

* 🎨 Improve formatting

* 🔥 Remove duplicate MySQL global setting

* 🚚 Move comment

*  Update default test script

* 🔥 Remove unneeded helper

*  Unmarshal answers from Postgres

* 🐛 Phase out `isTestRun`

*  Refactor `isEmailSetup`

* 🔥 Remove unneeded imports

*  Handle bootstrap connection errors

* 🔥 Remove unneeded imports

* 🔥 Remove outdated comments

* ✏️ Fix typos

* 🚚 Relocate `answersFormatter`

*  Undo package.json miscommit

* 🔥 Remove unneeded import

*  Refactor test DB prefixing

*  Add no-leftover check to MySQL

* 📦 Update package.json

*  Autoincrement on simulated MySQL truncation

* 🔥 Remove debugging queries

* ✏️ fix email template link expiry

* 🔥 remove unused import

*  fix testing email not sent error

* fix duplicate import

* add package lock

* fix export

* change opacity

* fix text issue

* update action box

* update error title

* update forgot password

* update survey

* update product text

* remove unset fields

* add category to page events

* remove duplicate import

* update key

* update key

* update label type

* 🎨 um/fe review (#2946)

* 🐳 Update Node.js versions of Docker images to 16

* 🐛 Fix that some keyboard shortcuts no longer worked

* N8N-3057 Fixed Keyboard shortcuts no longer working on / Fixed callDebounced function

* N8N-3057 Update Debounce Function

* N8N-3057 Refactor callDebounce function

* N8N-3057 Update Debounce Function

* 🐛 Fix issue with tooltips getting displayed behind node details view

* fix tooltips z-index

* move all element ui components

* update package lock

* 🐛 Fix credentials list load issue (#2931)

* always fetch credentials

* only fetch credentials once

*  Allow to disable hiring banner (#2902)

*  Add flag

*  Adjust interfaces

*  Adjust store module

*  Adjust frontend settings

*  Adjust frontend display

* 🐛 Fix issue that ctrl + o behaved incorrectly on workflow templates page (#2934)

* N8N-3094 Workflow Templates cmd-o acts on the Preview/Iframe

* N8N-3094 Workflow Templates cmd-o acts on the Preview/Iframe

* disable shortcuts for preview

Co-authored-by: Mutasem <mutdmour@gmail.com>

* ⬆️ Update package-lock.json file

* 🐛 Fix sorting by field in Baserow Node (#2942)

This fixes a bug which currently leads to the "Sorting" option of the node being ignored.

* 🐛 Fix some i18n line break issues

*  Add Odoo Node (#2601)

* added odoo scaffolding

* update getting data from odoo instance

* added scaffolding for main loop and request functions

* added functions for CRUD opperations

* improved error handling for odooJSONRPCRequest

* updated odoo node and fixing nodelinter issues

* fixed alphabetical order

* fixed types in odoo node

* fixing linter errors

* fixing linter errors

* fixed data shape returned from main loop

* updated node input types, added fields list to models

* update: when a custom resource is selected, options for the fields list are populated dynamically

* minor fixes

* 🔨 fixed credential test, updating CRUD methods

* 🔨 added additional fields to crm resource

* 🔨 added descriptions, fixed credentials test bug

* 🔨 standardize node and descriptions design

* 🔨 removed comments

* 🔨 added pagination to getAll operation

*  removed leftover function from previous implementation, removed required from optional fields

* fixed id field, added indication of type and whether required to field description, replaced string input in filters with a fetched list of fields

* 🔨 fetching list of models from odoo, added selection of fields to be returned to predefined models, fixes according to review

*  Small improvements

* 🔨 extracted address fields into collection, changed fields to include in descriptions, minor tweaks

*  Improvements

* 🔨 working on review

* 🔨 fixed linter errors

* 🔨 review wip

* 🔨 review wip

* 🔨 review wip

*  updated display name for URL in credentials

* 🔨 added checks for valid id to delete and update

*  Minor improvements

Co-authored-by: ricardo <ricardoespinoza105@gmail.com>
Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>

* 🐛 Handle Wise SCA requests (#2734)

*  Improve Wise error message after previous change

* fix duplicate import

* add package lock

* fix export

* change opacity

* fix text issue

* update action box

* update error title

* update forgot password

* update survey

* update product text

* remove unset fields

* add category to page events

* remove duplicate import

* update key

* update key

Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>
Co-authored-by: Oliver Trajceski <olivertrajceski@yahoo.com>
Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
Co-authored-by: Tom <19203795+that-one-tom@users.noreply.github.com>
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
Co-authored-by: ricardo <ricardoespinoza105@gmail.com>
Co-authored-by: pemontto <939704+pemontto@users.noreply.github.com>

* Move owner skip from settings

* 🐛 SMTP fixes (#2937)

* 🔥 Remove `UM_` from SMTP env vars

* 🔥 Remove SMTP host default value

*  Update sender value

*  Update invite template

*  Update password reset template

*  Update `N8N_EMAIL_MODE` default value

* 🔥 Remove `EMAIL` from all SMTP vars

*  Implement `verifyConnection()`

* 🚚 Reposition comment

* ✏️ Fix typo

* ✏️ Minor env var documentation improvements

* 🎨 Fix spacing

* 🎨 Fix spacing

* 🗃️ Remove SMTP settings cache

*  Adjust log message

*  Update error message

* ✏️ Fix template typo

* ✏️ Adjust wording

*  Interpolate email into success toast

* ✏️ Adjust base message in `verifyConnection()`

*  Verify connection on password reset

*  Bring up POST /users SMTP check

* 🐛 remove cookie if cookie is not valid

*  verify connection on instantiation

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

* 🔊 create logger helper for migrations (#2944)

* 🔥 remove unused database

* 🔊 add migration logging for sqlite

* 🔥 remove unnecessary index creation

* change log level to warn

* 🐛 Fix issue with workflow process to initialize db connection correctly (#2948)

* ✏️ update error messages for webhook run/activation

* 📈 Implement telemetry events (#2868)

* Implement basic telemetry events

* Fixing user id as part of the telemetry data

* Added user id to be part of the tracked data

*  Create telemetry mock

* 🧪 Fix tests with telemetry mock

* 🧪 Fix missing key in authless endpoint

* 📘 Create authless request type

* 🔥 Remove log

* 🐛 Fix `migration_strategy` assignment

* 📘 Remove `instance_id` from `ITelemetryUserDeletionData`

*  Simplify concatenation

*  Simplify `track()` call signature

* Fixed payload of telemetry to always include user_id

* Fixing minor issues

Co-authored-by: Iván Ovejero <ivov.src@gmail.com>

* 🔊 Added logs to credentials, executions and workflows (#2915)

* Added logs to credentials, executions and workflows

* Some updates according to ivov's feedback

*  update log levels

*  fix tests

Co-authored-by: Ben Hesseldieck <b.hesseldieck@gmail.com>

* 🐛 fix telemetry error

* fix conflicts with master

* fix duplicate

* add package-lock

* 🐛 Um/fixes (#2952)

* add initials to avatar

* redirect to signin if invalid token

* update pluralization

* add auth page category

* data transferred

* touch up setup page

* update button to add cursor

* fix personalization modal not closing

* ✏️ fix environment name

* 🐛 fix disabling UM

* 🐛 fix email setup flag

* 🐛 FE fixes 1 (#2953)

* add initials to avatar

* redirect to signin if invalid token

* update pluralization

* add auth page category

* data transferred

* touch up setup page

* update button to add cursor

* fix personalization modal not closing

* capitalize labels, refactor text

* Fixed the issue with telemetry data missing for personalization survey

* Changed invite email text

* 🐛 Fix quotes issue with postgres migration (#2958)

* Changed text for invite link

* 🐛 fix reset command for mysql

*  fix race condition in test DB creation

* 🔐 block user creation if UM is disabled

* 🥅 improve smtp setup issue error

*  update error message

* refactor route rules

* set package lock

* fix access

* remove capitalize

* update input labels

* refactor heading

* change span to fragment

* add route types

* refactor views

*  fix increase timeout for mysql

*  correct logic of error message

* refactor view names

*  update randomString

* 📈 Added missing event regarding failed emails (#2964)

* replace label with info

* 🛠 refactor JWT-secret creation

* remove duplicate key

* remove unused part

* remove semicolon

* fix up i18n pattern

* update translation keys

* update urls

* support i18n in nds

* fix how external keys are handled

* add source

* 💥 update timestamp of UM migration

* ✏️ small message updates

* fix tracking

* update notification line-height

* fix avatar opacity

* fix up empty state

* shift focus to input

* 🔐 Disable basic auth after owner has been set up (#2973)

* Disable basic auth after owner has been set up

* Remove unnecessary comparison

* rename modal title

* 🐛 use pgcrypto extension for uuid creation (#2977)

* 📧 Added public url variable for emails (#2967)

* Added public url variable for emails

* Fixed base url for reset password - the current implementation overrides a possibly existing path

* Change variable name to editorUrl

* Using correct name editorUrl for emails

* Changed variable description

* Improved base url naming and appending path so it remains consistent

* Removed trailing slash from editor base url

* 🌐 fix i18n pattern (#2970)

* fix up i18n pattern

* update translation keys

* update urls

* support i18n in nds

* fix how external keys are handled

* add source

* Um/fixes 1000 (#2980)

* fix select issue

* 😫 hacky solution to circumvent pgcrypto (#2979)

* fix owner bug after transfer. always fetch latest credentials

* add confirmation modal to setup

* Use webhook url as fallback when editor url is not defined

* fix enter bug

* update modal

* update modal

* update modal text, fix bug in settings view

* Updating editor url to not append path

* rename keys

Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
Co-authored-by: Mutasem Aldmour <4711238+mutdmour@users.noreply.github.com>
Co-authored-by: Mutasem <mutdmour@gmail.com>
Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>
Co-authored-by: Omar Ajoue <krynble@gmail.com>
Co-authored-by: Oliver Trajceski <olivertrajceski@yahoo.com>
Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>
Co-authored-by: Tom <19203795+that-one-tom@users.noreply.github.com>
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
Co-authored-by: ricardo <ricardoespinoza105@gmail.com>
Co-authored-by: pemontto <939704+pemontto@users.noreply.github.com>
Ben Hesseldieck 2022-03-14 14:46:32 +01:00 committed by GitHub
parent 761720621e
commit 7264239b83
262 changed files with 17737 additions and 3294 deletions


@@ -257,6 +257,11 @@ module.exports = {
 		 */
 		'@typescript-eslint/no-floating-promises': ['error', { ignoreVoid: true }],

+		/**
+		 * https://github.com/typescript-eslint/typescript-eslint/blob/v4.33.0/packages/eslint-plugin/docs/rules/no-namespace.md
+		 */
+		'@typescript-eslint/no-namespace': 'off',
+
 		/**
 		 * https://eslint.org/docs/1.0.0/rules/no-throw-literal
 		 */
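
For context, this is the kind of pattern the newly disabled rule would otherwise flag: a TypeScript `namespace` used to group helpers, as the test suites in this PR do. The snippet below is illustrative only and not taken from the PR:

```typescript
// Illustrative only: with '@typescript-eslint/no-namespace' set to 'off',
// grouping helpers like this no longer produces a lint error.
namespace TestHelpers {
	export function randomEmail(): string {
		return `user-${Date.now()}@example.com`;
	}
}

const email = TestHelpers.randomEmail();
console.log(email);
```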

package-lock.json (generated): 2246 changed lines; diff suppressed because it is too large.


@@ -53,3 +53,14 @@ declare module 'json-diff' {
 	}
 	export function diff(obj1: unknown, obj2: unknown, diffOptions: IDiffOptions): string;
 }
+
+type SmtpConfig = {
+	host: string;
+	port: number;
+	secure: boolean;
+	auth: {
+		user: string;
+		pass: string;
+	};
+	sender: string;
+};
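
For orientation, a minimal sketch (not from this PR) of how a config of this shape could back the invite and password-reset emails via nodemailer. The concrete values and the `sendUserManagementEmail` helper are illustrative assumptions; only `createTransport` and `sendMail` are real nodemailer API:

```typescript
import { createTransport } from 'nodemailer';

// Hypothetical values shaped like the SmtpConfig type above.
const config = {
	host: 'smtp.example.com',
	port: 465,
	secure: true,
	auth: { user: 'mailer@example.com', pass: 'secret' },
	sender: 'n8n <mailer@example.com>',
};

// host/port/secure/auth map directly onto nodemailer's transport options.
const transport = createTransport({
	host: config.host,
	port: config.port,
	secure: config.secure,
	auth: config.auth,
});

// Illustrative helper: send a user-management email from the configured sender.
async function sendUserManagementEmail(to: string, subject: string, html: string): Promise<void> {
	await transport.sendMail({ from: config.sender, to, subject, html });
}
```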


@@ -28,6 +28,7 @@ import {
 import { getLogger } from '../src/Logger';

 import config = require('../config');
+import { getInstanceOwner } from '../src/UserManagement/UserManagementHelper';

 export class Execute extends Command {
 	static description = '\nExecutes a given workflow';
@@ -169,11 +170,13 @@
 		}

 		try {
+			const user = await getInstanceOwner();
 			const runData: IWorkflowExecutionDataProcess = {
 				executionMode: 'cli',
 				startNodes: [startNode.name],
 				// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
 				workflowData: workflowData!,
+				userId: user.id,
 			};

 			const workflowRunner = new WorkflowRunner();


@@ -37,6 +37,8 @@ import {
 	WorkflowRunner,
 } from '../src';

 import config = require('../config');
+import { User } from '../src/databases/entities/User';
+import { getInstanceOwner } from '../src/UserManagement/UserManagementHelper';

 export class ExecuteBatch extends Command {
 	static description = '\nExecutes multiple workflows once';
@@ -57,6 +59,8 @@
 	static executionTimeout = 3 * 60 * 1000;

+	static instanceOwner: User;
+
 	static examples = [
 		`$ n8n executeBatch`,
 		`$ n8n executeBatch --concurrency=10 --skipList=/data/skipList.txt`,
@@ -279,6 +283,8 @@
 		// Wait till the database is ready
 		await startDbInitPromise;

+		ExecuteBatch.instanceOwner = await getInstanceOwner();
+
 		let allWorkflows;

 		const query = Db.collections.Workflow!.createQueryBuilder('workflows');
@@ -666,6 +672,7 @@
 					executionMode: 'cli',
 					startNodes: [startNode!.name],
 					workflowData,
+					userId: ExecuteBatch.instanceOwner.id,
 				};

 				const workflowRunner = new WorkflowRunner();


@@ -1,3 +1,9 @@
+/* eslint-disable no-restricted-syntax */
+/* eslint-disable @typescript-eslint/no-shadow */
+/* eslint-disable @typescript-eslint/no-unsafe-call */
+/* eslint-disable no-await-in-loop */
+/* eslint-disable @typescript-eslint/no-non-null-assertion */
+/* eslint-disable @typescript-eslint/no-unsafe-assignment */
 /* eslint-disable @typescript-eslint/no-unsafe-member-access */
 /* eslint-disable no-console */
 import { Command, flags } from '@oclif/command';
@@ -9,15 +15,25 @@ import { LoggerProxy } from 'n8n-workflow';
 import * as fs from 'fs';
 import * as glob from 'fast-glob';
 import * as path from 'path';
+import { EntityManager, getConnection } from 'typeorm';
 import { getLogger } from '../../src/Logger';
 import { Db } from '../../src';
+import { User } from '../../src/databases/entities/User';
+import { SharedCredentials } from '../../src/databases/entities/SharedCredentials';
+import { Role } from '../../src/databases/entities/Role';
+import { CredentialsEntity } from '../../src/databases/entities/CredentialsEntity';
+
+const FIX_INSTRUCTION =
+	'Please fix the database by running ./packages/cli/bin/n8n user-management:reset';

 export class ImportCredentialsCommand extends Command {
 	static description = 'Import credentials';

 	static examples = [
-		`$ n8n import:credentials --input=file.json`,
-		`$ n8n import:credentials --separate --input=backups/latest/`,
+		'$ n8n import:credentials --input=file.json',
+		'$ n8n import:credentials --separate --input=backups/latest/',
+		'$ n8n import:credentials --input=file.json --userId=1d64c3d2-85fe-4a83-a649-e446b07b3aae',
+		'$ n8n import:credentials --separate --input=backups/latest/ --userId=1d64c3d2-85fe-4a83-a649-e446b07b3aae',
 	];

 	static flags = {
@@ -29,87 +45,161 @@
 		separate: flags.boolean({
 			description: 'Imports *.json files from directory provided by --input',
 		}),
+		userId: flags.string({
+			description: 'The ID of the user to assign the imported credentials to',
+		}),
 	};

-	// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
-	async run() {
+	ownerCredentialRole: Role;
+
+	transactionManager: EntityManager;
+
+	async run(): Promise<void> {
 		const logger = getLogger();
 		LoggerProxy.init(logger);

-		// eslint-disable-next-line @typescript-eslint/no-shadow
 		const { flags } = this.parse(ImportCredentialsCommand);

 		if (!flags.input) {
-			console.info(`An input file or directory with --input must be provided`);
+			console.info('An input file or directory with --input must be provided');
 			return;
 		}

 		if (flags.separate) {
 			if (fs.existsSync(flags.input)) {
 				if (!fs.lstatSync(flags.input).isDirectory()) {
-					console.info(`The paramenter --input must be a directory`);
+					console.info('The argument to --input must be a directory');
 					return;
 				}
 			}
 		}

+		let totalImported = 0;
+
 		try {
 			await Db.init();

+			await this.initOwnerCredentialRole();
+			const user = flags.userId ? await this.getAssignee(flags.userId) : await this.getOwner();
+
 			// Make sure the settings exist
 			await UserSettings.prepareUserSettings();

-			let i;
 			const encryptionKey = await UserSettings.getEncryptionKey();

 			if (encryptionKey === undefined) {
-				throw new Error('No encryption key got found to encrypt the credentials!');
+				throw new Error('No encryption key found to encrypt the credentials!');
 			}

 			if (flags.separate) {
 				const files = await glob(
 					`${flags.input.endsWith(path.sep) ? flags.input : flags.input + path.sep}*.json`,
 				);

-				for (i = 0; i < files.length; i++) {
-					// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
-					const credential = JSON.parse(fs.readFileSync(files[i], { encoding: 'utf8' }));
-					// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
-					if (typeof credential.data === 'object') {
-						// plain data / decrypted input. Should be encrypted first.
-						// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
-						Credentials.prototype.setData.call(credential, credential.data, encryptionKey);
-					}
-					// eslint-disable-next-line no-await-in-loop, @typescript-eslint/no-non-null-assertion
-					await Db.collections.Credentials!.save(credential);
-				}
-			} else {
-				// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
-				const fileContents = JSON.parse(fs.readFileSync(flags.input, { encoding: 'utf8' }));
-				if (!Array.isArray(fileContents)) {
-					throw new Error(`File does not seem to contain credentials.`);
-				}
-				for (i = 0; i < fileContents.length; i++) {
-					if (typeof fileContents[i].data === 'object') {
-						// plain data / decrypted input. Should be encrypted first.
-						Credentials.prototype.setData.call(
-							fileContents[i],
-							fileContents[i].data,
-							encryptionKey,
-						);
-					}
-					// eslint-disable-next-line no-await-in-loop, @typescript-eslint/no-non-null-assertion
-					await Db.collections.Credentials!.save(fileContents[i]);
-				}
-			}
-			console.info(`Successfully imported ${i} ${i === 1 ? 'credential.' : 'credentials.'}`);
-			process.exit(0);
+				totalImported = files.length;
+
+				await getConnection().transaction(async (transactionManager) => {
+					this.transactionManager = transactionManager;
+
+					for (const file of files) {
+						const credential = JSON.parse(fs.readFileSync(file, { encoding: 'utf8' }));
+
+						if (typeof credential.data === 'object') {
+							// plain data / decrypted input. Should be encrypted first.
+							Credentials.prototype.setData.call(credential, credential.data, encryptionKey);
+						}
+
+						await this.storeCredential(credential, user);
+					}
+				});
+
+				this.reportSuccess(totalImported);
+				process.exit();
+			}
+
+			const credentials = JSON.parse(fs.readFileSync(flags.input, { encoding: 'utf8' }));
+
+			totalImported = credentials.length;
+
+			if (!Array.isArray(credentials)) {
+				throw new Error(
+					'File does not seem to contain credentials. Make sure the credentials are contained in an array.',
+				);
+			}
+
+			await getConnection().transaction(async (transactionManager) => {
+				this.transactionManager = transactionManager;
+
+				for (const credential of credentials) {
+					if (typeof credential.data === 'object') {
+						// plain data / decrypted input. Should be encrypted first.
+						Credentials.prototype.setData.call(credential, credential.data, encryptionKey);
+					}
+
+					await this.storeCredential(credential, user);
+				}
+			});
+
+			this.reportSuccess(totalImported);
+			process.exit();
 		} catch (error) {
-			console.error('An error occurred while exporting credentials. See log messages for details.');
-			logger.error(error.message);
+			console.error('An error occurred while importing credentials. See log messages for details.');
+			if (error instanceof Error) logger.error(error.message);
 			this.exit(1);
 		}
 	}
+
+	private reportSuccess(total: number) {
+		console.info(`Successfully imported ${total} ${total === 1 ? 'workflow.' : 'workflows.'}`);
+	}
+
+	private async initOwnerCredentialRole() {
+		const ownerCredentialRole = await Db.collections.Role!.findOne({
+			where: { name: 'owner', scope: 'credential' },
+		});
+
+		if (!ownerCredentialRole) {
+			throw new Error(`Failed to find owner credential role. ${FIX_INSTRUCTION}`);
+		}
+
+		this.ownerCredentialRole = ownerCredentialRole;
+	}
+
+	private async storeCredential(credential: object, user: User) {
+		const newCredential = new CredentialsEntity();
+
+		Object.assign(newCredential, credential);
+
+		const savedCredential = await this.transactionManager.save<CredentialsEntity>(newCredential);
+
+		const newSharedCredential = new SharedCredentials();
+
+		Object.assign(newSharedCredential, {
+			credentials: savedCredential,
+			user,
+			role: this.ownerCredentialRole,
+		});

+		await this.transactionManager.save<SharedCredentials>(newSharedCredential);
+	}
+
+	private async getOwner() {
+		const ownerGlobalRole = await Db.collections.Role!.findOne({
+			where: { name: 'owner', scope: 'global' },
+		});
+
+		const owner = await Db.collections.User!.findOne({ globalRole: ownerGlobalRole });
+
+		if (!owner) {
+			throw new Error(`Failed to find owner. ${FIX_INSTRUCTION}`);
+		}
+
+		return owner;
+	}
+
+	private async getAssignee(userId: string) {
+		const user = await Db.collections.User!.findOne(userId);
+
+		if (!user) {
+			throw new Error(`Failed to find user with ID ${userId}`);
+		}
+
+		return user;
+	}
 }


@ -1,3 +1,11 @@
/* eslint-disable no-restricted-syntax */
/* eslint-disable @typescript-eslint/restrict-template-expressions */
/* eslint-disable @typescript-eslint/no-shadow */
/* eslint-disable @typescript-eslint/no-loop-func */
/* eslint-disable no-await-in-loop */
/* eslint-disable @typescript-eslint/no-unsafe-member-access */
/* eslint-disable @typescript-eslint/no-unsafe-call */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable no-console */ /* eslint-disable no-console */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */ /* eslint-disable @typescript-eslint/no-unsafe-assignment */
import { Command, flags } from '@oclif/command'; import { Command, flags } from '@oclif/command';
@ -7,15 +15,25 @@ import { INode, INodeCredentialsDetails, LoggerProxy } from 'n8n-workflow';
import * as fs from 'fs'; import * as fs from 'fs';
import * as glob from 'fast-glob'; import * as glob from 'fast-glob';
import { UserSettings } from 'n8n-core'; import { UserSettings } from 'n8n-core';
import { EntityManager, getConnection } from 'typeorm';
import { getLogger } from '../../src/Logger'; import { getLogger } from '../../src/Logger';
import { Db, ICredentialsDb } from '../../src'; import { Db, ICredentialsDb } from '../../src';
import { SharedWorkflow } from '../../src/databases/entities/SharedWorkflow';
import { WorkflowEntity } from '../../src/databases/entities/WorkflowEntity';
import { Role } from '../../src/databases/entities/Role';
import { User } from '../../src/databases/entities/User';
const FIX_INSTRUCTION =
'Please fix the database by running ./packages/cli/bin/n8n user-management:reset';
export class ImportWorkflowsCommand extends Command { export class ImportWorkflowsCommand extends Command {
static description = 'Import workflows'; static description = 'Import workflows';
static examples = [ static examples = [
`$ n8n import:workflow --input=file.json`, '$ n8n import:workflow --input=file.json',
`$ n8n import:workflow --separate --input=backups/latest/`, '$ n8n import:workflow --separate --input=backups/latest/',
'$ n8n import:workflow --input=file.json --userId=1d64c3d2-85fe-4a83-a649-e446b07b3aae',
'$ n8n import:workflow --separate --input=backups/latest/ --userId=1d64c3d2-85fe-4a83-a649-e446b07b3aae',
]; ];
static flags = { static flags = {
@ -27,12 +45,174 @@ export class ImportWorkflowsCommand extends Command {
separate: flags.boolean({ separate: flags.boolean({
description: 'Imports *.json files from directory provided by --input', description: 'Imports *.json files from directory provided by --input',
}), }),
userId: flags.string({
description: 'The ID of the user to assign the imported workflows to',
}),
}; };
ownerWorkflowRole: Role;
transactionManager: EntityManager;
async run(): Promise<void> {
const logger = getLogger();
LoggerProxy.init(logger);
const { flags } = this.parse(ImportWorkflowsCommand);
if (!flags.input) {
console.info('An input file or directory with --input must be provided');
return;
}
if (flags.separate) {
if (fs.existsSync(flags.input)) {
if (!fs.lstatSync(flags.input).isDirectory()) {
console.info('The argument to --input must be a directory');
return;
}
}
}
try {
await Db.init();
await this.initOwnerWorkflowRole();
const user = flags.userId ? await this.getAssignee(flags.userId) : await this.getOwner();
// Make sure the settings exist
await UserSettings.prepareUserSettings();
const credentials = (await Db.collections.Credentials?.find()) ?? [];
let totalImported = 0;
if (flags.separate) {
let { input: inputPath } = flags;
if (process.platform === 'win32') {
inputPath = inputPath.replace(/\\/g, '/');
}
inputPath = inputPath.replace(/\/$/g, '');
const files = await glob(`${inputPath}/*.json`);
totalImported = files.length;
await getConnection().transaction(async (transactionManager) => {
this.transactionManager = transactionManager;
for (const file of files) {
const workflow = JSON.parse(fs.readFileSync(file, { encoding: 'utf8' }));
if (credentials.length > 0) {
workflow.nodes.forEach((node: INode) => {
this.transformCredentials(node, credentials);
});
}
await this.storeWorkflow(workflow, user);
}
});
this.reportSuccess(totalImported);
process.exit();
}
const workflows = JSON.parse(fs.readFileSync(flags.input, { encoding: 'utf8' }));
totalImported = workflows.length;
if (!Array.isArray(workflows)) {
throw new Error(
'File does not seem to contain workflows. Make sure the workflows are contained in an array.',
);
}
await getConnection().transaction(async (transactionManager) => {
this.transactionManager = transactionManager;
for (const workflow of workflows) {
if (credentials.length > 0) {
workflow.nodes.forEach((node: INode) => {
this.transformCredentials(node, credentials);
});
}
await this.storeWorkflow(workflow, user);
}
});
this.reportSuccess(totalImported);
process.exit();
} catch (error) {
console.error('An error occurred while importing workflows. See log messages for details.');
if (error instanceof Error) logger.error(error.message);
this.exit(1);
}
}
private reportSuccess(total: number) {
console.info(`Successfully imported ${total} ${total === 1 ? 'workflow.' : 'workflows.'}`);
}
private async initOwnerWorkflowRole() {
const ownerWorkflowRole = await Db.collections.Role!.findOne({
where: { name: 'owner', scope: 'workflow' },
});
if (!ownerWorkflowRole) {
throw new Error(`Failed to find owner workflow role. ${FIX_INSTRUCTION}`);
}
this.ownerWorkflowRole = ownerWorkflowRole;
}
private async storeWorkflow(workflow: object, user: User) {
const newWorkflow = new WorkflowEntity();
Object.assign(newWorkflow, workflow);
const savedWorkflow = await this.transactionManager.save<WorkflowEntity>(newWorkflow);
const newSharedWorkflow = new SharedWorkflow();
Object.assign(newSharedWorkflow, {
workflow: savedWorkflow,
user,
role: this.ownerWorkflowRole,
});
await this.transactionManager.save<SharedWorkflow>(newSharedWorkflow);
}
private async getOwner() {
const ownerGlobalRole = await Db.collections.Role!.findOne({
where: { name: 'owner', scope: 'global' },
});
const owner = await Db.collections.User!.findOne({ globalRole: ownerGlobalRole });
if (!owner) {
throw new Error(`Failed to find owner. ${FIX_INSTRUCTION}`);
}
return owner;
}
private async getAssignee(userId: string) {
const user = await Db.collections.User!.findOne(userId);
if (!user) {
throw new Error(`Failed to find user with ID ${userId}`);
}
return user;
}
private transformCredentials(node: INode, credentialsEntities: ICredentialsDb[]) { private transformCredentials(node: INode, credentialsEntities: ICredentialsDb[]) {
if (node.credentials) { if (node.credentials) {
const allNodeCredentials = Object.entries(node.credentials); const allNodeCredentials = Object.entries(node.credentials);
// eslint-disable-next-line no-restricted-syntax
for (const [type, name] of allNodeCredentials) { for (const [type, name] of allNodeCredentials) {
if (typeof name === 'string') { if (typeof name === 'string') {
const nodeCredentials: INodeCredentialsDetails = { const nodeCredentials: INodeCredentialsDetails = {
@ -54,80 +234,4 @@ export class ImportWorkflowsCommand extends Command {
} }
} }
} }
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
async run() {
const logger = getLogger();
LoggerProxy.init(logger);
// eslint-disable-next-line @typescript-eslint/no-shadow
const { flags } = this.parse(ImportWorkflowsCommand);
if (!flags.input) {
console.info(`An input file or directory with --input must be provided`);
return;
}
if (flags.separate) {
if (fs.existsSync(flags.input)) {
if (!fs.lstatSync(flags.input).isDirectory()) {
console.info(`The paramenter --input must be a directory`);
return;
}
}
}
try {
await Db.init();
// Make sure the settings exist
await UserSettings.prepareUserSettings();
const credentialsEntities = (await Db.collections.Credentials?.find()) ?? [];
let i;
if (flags.separate) {
let inputPath = flags.input;
if (process.platform === 'win32') {
inputPath = inputPath.replace(/\\/g, '/');
}
inputPath = inputPath.replace(/\/$/g, '');
const files = await glob(`${inputPath}/*.json`);
for (i = 0; i < files.length; i++) {
const workflow = JSON.parse(fs.readFileSync(files[i], { encoding: 'utf8' }));
if (credentialsEntities.length > 0) {
// eslint-disable-next-line
workflow.nodes.forEach((node: INode) => {
this.transformCredentials(node, credentialsEntities);
});
}
// eslint-disable-next-line no-await-in-loop, @typescript-eslint/no-non-null-assertion
await Db.collections.Workflow!.save(workflow);
}
} else {
const fileContents = JSON.parse(fs.readFileSync(flags.input, { encoding: 'utf8' }));
if (!Array.isArray(fileContents)) {
throw new Error(`File does not seem to contain workflows.`);
}
for (i = 0; i < fileContents.length; i++) {
if (credentialsEntities.length > 0) {
// eslint-disable-next-line
fileContents[i].nodes.forEach((node: INode) => {
this.transformCredentials(node, credentialsEntities);
});
}
// eslint-disable-next-line no-await-in-loop, @typescript-eslint/no-non-null-assertion
await Db.collections.Workflow!.save(fileContents[i]);
}
}
console.info(`Successfully imported ${i} ${i === 1 ? 'workflow.' : 'workflows.'}`);
process.exit(0);
} catch (error) {
console.error('An error occurred while exporting workflows. See log messages for details.');
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
logger.error(error.message);
this.exit(1);
}
}
}
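The body of `transformCredentials` is cut off by the hunk marker above. Purely as a hypothetical sketch (not this PR's actual implementation), a helper of that shape typically maps a credential referenced by name on a node to the `{ id, name }` form expected after the refactor, using the entities loaded from the database:

// Hypothetical sketch only; names and logic are illustrative, not taken from the PR.
function toNodeCredentialDetails(
	type: string,
	name: string,
	credentialsEntities: ICredentialsDb[],
): INodeCredentialsDetails {
	const match = credentialsEntities.find((c) => c.name === name && c.type === type);
	return { id: match ? match.id.toString() : null, name };
}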


@ -1,3 +1,4 @@
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable @typescript-eslint/await-thenable */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
@ -12,6 +13,7 @@ import { Command, flags } from '@oclif/command';
import * as Redis from 'ioredis';
import { IDataObject, LoggerProxy } from 'n8n-workflow';
import { createHash } from 'crypto';
import * as config from '../config';
import {
ActiveExecutions,
@ -31,6 +33,7 @@
} from '../src';
import { getLogger } from '../src/Logger';
import { RESPONSE_ERROR_MESSAGES } from '../src/constants';
// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment, @typescript-eslint/no-var-requires
const open = require('open');
@ -166,6 +169,26 @@ export class Start extends Command {
// Make sure the settings exist
const userSettings = await UserSettings.prepareUserSettings();
if (!config.get('userManagement.jwtSecret')) {
// If we don't have a JWT secret set, generate
// one based on the encryption key and save it to config.
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new Error('Fatal error setting up user management: no encryption key set.');
}
// Form a base key from every other character of the encryption key.
// CAREFUL: do not change this or it breaks all existing tokens.
let baseKey = '';
for (let i = 0; i < encryptionKey.length; i += 2) {
baseKey += encryptionKey[i];
}
config.set(
'userManagement.jwtSecret',
createHash('sha256').update(baseKey).digest('hex'),
);
}
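As a standalone illustration of the derivation above (the key value is made up, not a real secret):

import { createHash } from 'crypto';

// Take every other character of the encryption key, then SHA-256 the result.
const encryptionKey = 'n8n-example-encryption-key'; // hypothetical value for illustration
let baseKey = '';
for (let i = 0; i < encryptionKey.length; i += 2) {
	baseKey += encryptionKey[i];
}
const jwtSecret = createHash('sha256').update(baseKey).digest('hex'); // 64-character hex string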
// Load all node and credential types
const loadNodesAndCredentials = LoadNodesAndCredentials();
await loadNodesAndCredentials.init();
@ -187,6 +210,18 @@ export class Start extends Command {
// Wait till the database is ready
await startDbInitPromise;
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new Error(RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY);
}
// Load settings from database and set them to config.
const databaseSettings = await Db.collections.Settings!.find({ loadOnStartup: true });
databaseSettings.forEach((setting) => {
config.set(setting.key, JSON.parse(setting.value));
});
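A small hypothetical example of such a settings row and how its JSON-encoded value is restored before being applied; `config` is the convict instance already imported in this file, and the key is one this PR uses elsewhere:

// Values are stored as JSON strings on the Settings entity.
const exampleRow = { key: 'userManagement.isInstanceOwnerSetUp', value: 'false', loadOnStartup: true };
config.set(exampleRow.key, JSON.parse(exampleRow.value)); // 'false' -> false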
if (config.get('executions.mode') === 'queue') {
const redisHost = config.get('queue.bull.redis.host');
const redisPassword = config.get('queue.bull.redis.password');
@ -319,6 +354,12 @@ export class Start extends Command {
const editorUrl = GenericHelpers.getBaseUrl();
this.log(`\nEditor is now accessible via:\n${editorUrl}`);
const saveManualExecutions = config.get('executions.saveDataManualExecutions') as boolean;
if (saveManualExecutions) {
this.log('\nManual executions will be visible only for the owner');
}
// Allow to open n8n editor by pressing "o"
if (Boolean(process.stdout.isTTY) && process.stdin.setRawMode) {
process.stdin.setRawMode(true);


@ -0,0 +1,86 @@
/* eslint-disable no-console */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
import Command from '@oclif/command';
import { Not } from 'typeorm';
import { LoggerProxy } from 'n8n-workflow';
import { Db } from '../../src';
import { User } from '../../src/databases/entities/User';
import { getLogger } from '../../src/Logger';
export class Reset extends Command {
static description = '\nResets the database to the default user state';
private defaultUserProps = {
firstName: null,
lastName: null,
email: null,
password: null,
resetPasswordToken: null,
};
async run(): Promise<void> {
const logger = getLogger();
LoggerProxy.init(logger);
await Db.init();
try {
const owner = await this.getInstanceOwner();
const ownerWorkflowRole = await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'workflow',
});
const ownerCredentialRole = await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'credential',
});
await Db.collections.SharedWorkflow!.update(
{ user: { id: Not(owner.id) }, role: ownerWorkflowRole },
{ user: owner },
);
await Db.collections.SharedCredentials!.update(
{ user: { id: Not(owner.id) }, role: ownerCredentialRole },
{ user: owner },
);
await Db.collections.User!.delete({ id: Not(owner.id) });
await Db.collections.User!.save(Object.assign(owner, this.defaultUserProps));
await Db.collections.Settings!.update(
{ key: 'userManagement.isInstanceOwnerSetUp' },
{ value: 'false' },
);
await Db.collections.Settings!.update(
{ key: 'userManagement.skipInstanceOwnerSetup' },
{ value: 'false' },
);
} catch (error) {
console.error('Error resetting database. See log messages for details.');
if (error instanceof Error) logger.error(error.message);
this.exit(1);
}
console.info('Successfully reset the database to default user state.');
this.exit();
}
private async getInstanceOwner(): Promise<User> {
const globalRole = await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'global',
});
const owner = await Db.collections.User!.findOne({ globalRole });
if (owner) return owner;
const user = new User();
await Db.collections.User!.save(Object.assign(user, { ...this.defaultUserProps, globalRole }));
return Db.collections.User!.findOneOrFail({ globalRole });
}
}


@ -41,6 +41,10 @@ import { getLogger } from '../src/Logger';
import * as config from '../config';
import * as Queue from '../src/Queue';
import {
checkPermissionsForExecution,
getWorkflowOwner,
} from '../src/UserManagement/UserManagementHelper';
export class Worker extends Command {
static description = '\nStarts a n8n worker';
@ -123,6 +127,8 @@ export class Worker extends Command {
`Start job: ${job.id} (Workflow ID: ${currentExecutionDb.workflowData.id} | Execution: ${jobData.executionId})`,
);
const workflowOwner = await getWorkflowOwner(currentExecutionDb.workflowData.id!.toString());
let { staticData } = currentExecutionDb.workflowData;
if (jobData.loadStaticData) {
const findOptions = {
@ -166,7 +172,10 @@ export class Worker extends Command {
settings: currentExecutionDb.workflowData.settings,
});
await checkPermissionsForExecution(workflow, workflowOwner.id);
const additionalData = await WorkflowExecuteAdditionalData.getBase(
workflowOwner.id,
undefined,
executionTimeoutTimestamp,
);


@ -351,6 +351,7 @@ const config = convict({
},
},
},
generic: {
// The timezone to use. Is important for nodes like "Cron" which start the
// workflow automatically at a specified time. This setting can also be
@ -410,6 +411,12 @@ const config = convict({
env: 'N8N_SSL_CERT',
doc: 'SSL Cert for HTTPS Protocol',
},
editorBaseUrl: {
format: String,
default: '',
env: 'N8N_EDITOR_BASE_URL',
doc: 'Public URL where the editor is accessible. Also used for emails sent from n8n.',
},
security: {
excludeEndpoints: {
@ -573,6 +580,90 @@ const config = convict({
},
},
workflowTagsDisabled: {
format: Boolean,
default: false,
env: 'N8N_WORKFLOW_TAGS_DISABLED',
doc: 'Disable workflow tags.',
},
userManagement: {
disabled: {
doc: 'Disable user management and hide it completely.',
format: Boolean,
default: false,
env: 'N8N_USER_MANAGEMENT_DISABLED',
},
jwtSecret: {
doc: 'Set a specific JWT secret (optional - n8n can generate one)', // Generated @ start.ts
format: String,
default: '',
env: 'N8N_USER_MANAGEMENT_JWT_SECRET',
},
emails: {
mode: {
doc: 'How to send emails',
format: ['', 'smtp'],
default: 'smtp',
env: 'N8N_EMAIL_MODE',
},
smtp: {
host: {
doc: 'SMTP server host',
format: String, // e.g. 'smtp.gmail.com'
default: '',
env: 'N8N_SMTP_HOST',
},
port: {
doc: 'SMTP server port',
format: Number,
default: 465,
env: 'N8N_SMTP_PORT',
},
secure: {
doc: 'Whether or not to use SSL for SMTP',
format: Boolean,
default: true,
env: 'N8N_SMTP_SSL',
},
auth: {
user: {
doc: 'SMTP login username',
format: String, // e.g.'you@gmail.com'
default: '',
env: 'N8N_SMTP_USER',
},
pass: {
doc: 'SMTP login password',
format: String,
default: '',
env: 'N8N_SMTP_PASS',
},
},
sender: {
doc: 'How to display sender name',
format: String,
default: '',
env: 'N8N_SMTP_SENDER',
},
},
templates: {
invite: {
doc: 'Overrides default HTML template for inviting new people (use full path)',
format: String,
default: '',
env: 'N8N_UM_EMAIL_TEMPLATES_INVITE',
},
passwordReset: {
doc: 'Overrides default HTML template for resetting password (use full path)',
format: String,
default: '',
env: 'N8N_UM_EMAIL_TEMPLATES_PWRESET',
},
},
},
},
externalHookFiles: {
doc: 'Files containing external hooks. Multiple files can be separated by colon (":")',
format: String,
@ -636,8 +727,8 @@ const config = convict({
logs: {
level: {
doc: 'Log output level. Options are error, warn, info, verbose and debug.', doc: 'Log output level',
format: String, format: ['error', 'warn', 'info', 'verbose', 'debug'],
default: 'info',
env: 'N8N_LOG_LEVEL',
},
@ -713,10 +804,10 @@ const config = convict({
doc: 'Available modes of binary data storage, as comma separated strings',
},
mode: {
format: String, format: ['default', 'filesystem'],
default: 'default',
env: 'N8N_DEFAULT_BINARY_DATA_MODE',
doc: 'Storage mode for binary data, default | filesystem', doc: 'Storage mode for binary data',
},
localStoragePath: {
format: String,
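A hedged sketch of reading the new user management keys at runtime through the existing convict config module; the helper name is illustrative, and the import path mirrors the ones used elsewhere in this package:

import * as config from '../config';

// True when invite and password-reset emails can actually be sent.
function isEmailSetUp(): boolean {
	const mode = config.get('userManagement.emails.mode') as string;
	const host = config.get('userManagement.emails.smtp.host') as string;
	return mode === 'smtp' && host !== '';
}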


@ -0,0 +1,17 @@
module.exports = {
verbose: true,
transform: {
'^.+\\.ts?$': 'ts-jest',
},
testURL: 'http://localhost/',
testRegex: '(/__tests__/.*|(\\.|/)(test))\\.ts$',
testPathIgnorePatterns: ['/dist/', '/node_modules/'],
moduleFileExtensions: ['ts', 'js', 'json'],
globals: {
'ts-jest': {
isolatedModules: true,
},
},
globalTeardown: '<rootDir>/test/teardown.ts',
setupFiles: ['<rootDir>/test/setup.ts'],
};


@ -19,7 +19,7 @@
"bin": "n8n" "bin": "n8n"
}, },
"scripts": { "scripts": {
"build": "tsc", "build": "tsc && cp -r ./src/UserManagement/email/templates ./dist/src/UserManagement/email",
"dev": "concurrently -k -n \"TypeScript,Node\" -c \"yellow.bold,cyan.bold\" \"npm run watch\" \"nodemon\"", "dev": "concurrently -k -n \"TypeScript,Node\" -c \"yellow.bold,cyan.bold\" \"npm run watch\" \"nodemon\"",
"format": "cd ../.. && node_modules/prettier/bin-prettier.js packages/cli/**/**.ts --write", "format": "cd ../.. && node_modules/prettier/bin-prettier.js packages/cli/**/**.ts --write",
"lint": "cd ../.. && node_modules/eslint/bin/eslint.js packages/cli", "lint": "cd ../.. && node_modules/eslint/bin/eslint.js packages/cli",
@ -29,7 +29,10 @@
"start": "run-script-os", "start": "run-script-os",
"start:default": "cd bin && ./n8n", "start:default": "cd bin && ./n8n",
"start:windows": "cd bin && n8n", "start:windows": "cd bin && n8n",
"test": "jest", "test": "npm run test:sqlite",
"test:sqlite": "export DB_TYPE=sqlite && jest",
"test:postgres": "export DB_TYPE=postgresdb && jest",
"test:mysql": "export DB_TYPE=mysqldb && jest",
"watch": "tsc --watch", "watch": "tsc --watch",
"typeorm": "ts-node ../../node_modules/typeorm/cli.js" "typeorm": "ts-node ../../node_modules/typeorm/cli.js"
}, },
@ -61,23 +64,29 @@
"@types/compression": "1.0.1", "@types/compression": "1.0.1",
"@types/connect-history-api-fallback": "^1.3.1", "@types/connect-history-api-fallback": "^1.3.1",
"@types/convict": "^4.2.1", "@types/convict": "^4.2.1",
"@types/cookie-parser": "^1.4.2",
"@types/dotenv": "^8.2.0", "@types/dotenv": "^8.2.0",
"@types/express": "^4.17.6", "@types/express": "^4.17.6",
"@types/jest": "^26.0.13", "@types/jest": "^27.4.0",
"@types/localtunnel": "^1.9.0", "@types/localtunnel": "^1.9.0",
"@types/lodash.get": "^4.4.6", "@types/lodash.get": "^4.4.6",
"@types/lodash.merge": "^4.6.6", "@types/lodash.merge": "^4.6.6",
"@types/node": "14.17.27", "@types/node": "14.17.27",
"@types/open": "^6.1.0", "@types/open": "^6.1.0",
"@types/parseurl": "^1.3.1", "@types/parseurl": "^1.3.1",
"@types/passport-jwt": "^3.0.6",
"@types/request-promise-native": "~1.0.15", "@types/request-promise-native": "~1.0.15",
"@types/superagent": "4.1.13",
"@types/supertest": "^2.0.11",
"@types/uuid": "^8.3.0",
"@types/validator": "^13.7.0", "@types/validator": "^13.7.0",
"axios": "^0.21.1", "axios": "^0.21.1",
"concurrently": "^5.1.0", "concurrently": "^5.1.0",
"jest": "^26.4.2", "jest": "^27.4.7",
"nodemon": "^2.0.2", "nodemon": "^2.0.2",
"run-script-os": "^1.0.7", "run-script-os": "^1.0.7",
"ts-jest": "^26.3.0", "supertest": "^6.2.2",
"ts-jest": "^27.1.3",
"ts-node": "^8.9.1", "ts-node": "^8.9.1",
"tslint": "^6.1.2", "tslint": "^6.1.2",
"typescript": "~4.3.5" "typescript": "~4.3.5"
@ -94,11 +103,14 @@
"body-parser-xml": "^2.0.3", "body-parser-xml": "^2.0.3",
"bull": "^3.19.0", "bull": "^3.19.0",
"callsites": "^3.1.0", "callsites": "^3.1.0",
"change-case": "^4.1.1",
"class-validator": "^0.13.1", "class-validator": "^0.13.1",
"client-oauth2": "^4.2.5", "client-oauth2": "^4.2.5",
"compression": "^1.7.4", "compression": "^1.7.4",
"connect-history-api-fallback": "^1.6.0", "connect-history-api-fallback": "^1.6.0",
"convict": "^6.0.1", "convict": "^6.0.1",
"cookie-parser": "^1.4.6",
"crypto-js": "^4.1.1",
"csrf": "^3.1.0", "csrf": "^3.1.0",
"dotenv": "^8.0.0", "dotenv": "^8.0.0",
"express": "^4.16.4", "express": "^4.16.4",
@ -117,33 +129,22 @@
"n8n-editor-ui": "~0.134.0", "n8n-editor-ui": "~0.134.0",
"n8n-nodes-base": "~0.165.0", "n8n-nodes-base": "~0.165.0",
"n8n-workflow": "~0.90.0", "n8n-workflow": "~0.90.0",
"nodemailer": "^6.7.1",
"oauth-1.0a": "^2.2.6", "oauth-1.0a": "^2.2.6",
"open": "^7.0.0", "open": "^7.0.0",
"p-cancelable": "^2.0.0", "p-cancelable": "^2.0.0",
"passport": "^0.5.0",
"passport-cookie": "^1.0.9",
"passport-jwt": "^4.0.0",
"pg": "^8.3.0", "pg": "^8.3.0",
"prom-client": "^13.1.0", "prom-client": "^13.1.0",
"request-promise-native": "^1.0.7", "request-promise-native": "^1.0.7",
"sqlite3": "^5.0.1", "sqlite3": "^5.0.2",
"sse-channel": "^3.1.1", "sse-channel": "^3.1.1",
"tslib": "1.14.1", "tslib": "1.14.1",
"typeorm": "0.2.30", "typeorm": "0.2.30",
"uuid": "^8.3.0",
"validator": "13.7.0",
"winston": "^3.3.3" "winston": "^3.3.3"
},
"jest": {
"transform": {
"^.+\\.tsx?$": "ts-jest"
},
"testURL": "http://localhost/",
"testRegex": "(/__tests__/.*|(\\.|/)(test|spec))\\.(jsx?|tsx?)$",
"testPathIgnorePatterns": [
"/dist/",
"/node_modules/"
],
"moduleFileExtensions": [
"ts",
"tsx",
"js",
"json"
]
}
}


@ -1,3 +1,4 @@
/* eslint-disable import/no-cycle */
/* eslint-disable prefer-spread */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable no-param-reassign */
@ -48,6 +49,9 @@ import {
ExternalHooks,
} from '.';
import config = require('../config');
import { User } from './databases/entities/User';
import { whereClause } from './WorkflowHelpers';
import { WorkflowEntity } from './databases/entities/WorkflowEntity';
const WEBHOOK_PROD_UNREGISTERED_HINT = `The workflow must be active for a production URL to run successfully. You can activate the workflow using the toggle in the top-right of the editor. Note that unlike test URL calls, production URL calls aren't shown on the canvas (only in the executions list)`;
@ -66,7 +70,8 @@ export class ActiveWorkflowRunner {
// Here I guess we can have a flag on the workflow table like hasTrigger
// so instead of pulling all the active webhooks just pull the actives that have a trigger
const workflowsData: IWorkflowDb[] = (await Db.collections.Workflow!.find({
active: true, where: { active: true },
relations: ['shared', 'shared.user', 'shared.user.globalRole'],
})) as IWorkflowDb[];
if (!config.get('endpoints.skipWebhoooksDeregistrationOnShutdown')) {
@ -102,7 +107,7 @@ export class ActiveWorkflowRunner {
});
console.log(` => Started`);
} catch (error) {
console.log(` => ERROR: Workflow could not be activated:`); console.log(` => ERROR: Workflow could not be activated`);
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions
console.log(` ${error.message}`);
Logger.error(`Unable to initialize workflow "${workflowData.name}" (startup)`, {
@ -251,7 +256,9 @@ export class ActiveWorkflowRunner {
});
}
const workflowData = await Db.collections.Workflow!.findOne(webhook.workflowId); const workflowData = await Db.collections.Workflow!.findOne(webhook.workflowId, {
relations: ['shared', 'shared.user', 'shared.user.globalRole'],
});
if (workflowData === undefined) {
throw new ResponseHelper.ResponseError(
`Could not find workflow with id "${webhook.workflowId}"`,
@ -272,7 +279,9 @@ export class ActiveWorkflowRunner {
settings: workflowData.settings,
});
const additionalData = await WorkflowExecuteAdditionalData.getBase(); const additionalData = await WorkflowExecuteAdditionalData.getBase(
workflowData.shared[0].user.id,
);
const webhookData = NodeHelpers.getNodeWebhooks(
workflow,
@ -336,14 +345,30 @@
* @returns {string[]}
* @memberof ActiveWorkflowRunner
*/
async getActiveWorkflows(): Promise<IWorkflowDb[]> { async getActiveWorkflows(user?: User): Promise<IWorkflowDb[]> {
const activeWorkflows = (await Db.collections.Workflow?.find({ let activeWorkflows: WorkflowEntity[] = [];
where: { active: true },
select: ['id'], if (!user || user.globalRole.name === 'owner') {
})) as IWorkflowDb[]; activeWorkflows = await Db.collections.Workflow!.find({
return activeWorkflows.filter( select: ['id'],
(workflow) => this.activationErrors[workflow.id.toString()] === undefined, where: { active: true },
); });
} else {
const shared = await Db.collections.SharedWorkflow!.find({
relations: ['workflow'],
where: whereClause({
user,
entityType: 'workflow',
}),
});
activeWorkflows = shared.reduce<WorkflowEntity[]>((acc, cur) => {
if (cur.workflow.active) acc.push(cur.workflow);
return acc;
}, []);
}
return activeWorkflows.filter((workflow) => this.activationErrors[workflow.id] === undefined);
}
/**
@ -354,8 +379,8 @@ export class ActiveWorkflowRunner {
* @memberof ActiveWorkflowRunner
*/
async isActive(id: string): Promise<boolean> { async isActive(id: string): Promise<boolean> {
const workflow = (await Db.collections.Workflow?.findOne({ id: Number(id) })) as IWorkflowDb; const workflow = await Db.collections.Workflow!.findOne(id);
return workflow?.active; return !!workflow?.active;
}
/**
@ -462,19 +487,14 @@ export class ActiveWorkflowRunner {
);
}
let errorMessage = '';
// if it's a workflow from the insert
// TODO check if there is a standard error code for duplicate key violation that works
// with all databases
if (error.name === 'QueryFailedError') { if (error.name === 'QueryFailedError') {
errorMessage = `The webhook path [${webhook.webhookPath}] and method [${webhook.method}] already exist.`; error.message = `The URL path that the "${webhook.node}" node uses is already taken. Please change it to something else.`;
} else if (error.detail) {
// it's an error running the webhook methods (checkExists, create)
errorMessage = error.detail; error.message = error.detail;
} else {
// eslint-disable-next-line @typescript-eslint/no-unused-vars
errorMessage = error.message;
} }
throw error;
@ -492,7 +512,9 @@ export class ActiveWorkflowRunner {
* @memberof ActiveWorkflowRunner
*/
async removeWorkflowWebhooks(workflowId: string): Promise<void> { async removeWorkflowWebhooks(workflowId: string): Promise<void> {
const workflowData = await Db.collections.Workflow!.findOne(workflowId); const workflowData = await Db.collections.Workflow!.findOne(workflowId, {
relations: ['shared', 'shared.user', 'shared.user.globalRole'],
});
if (workflowData === undefined) {
throw new Error(`Could not find workflow with id "${workflowId}"`);
}
@ -511,7 +533,9 @@ export class ActiveWorkflowRunner {
const mode = 'internal';
const additionalData = await WorkflowExecuteAdditionalData.getBase(); const additionalData = await WorkflowExecuteAdditionalData.getBase(
workflowData.shared[0].user.id,
);
const webhooks = WebhookHelpers.getWorkflowWebhooks(workflow, additionalData, undefined, true);
@ -578,6 +602,7 @@ export class ActiveWorkflowRunner {
// Start the workflow
const runData: IWorkflowExecutionDataProcess = {
userId: additionalData.userId,
executionMode: mode,
executionData,
workflowData,
@ -681,7 +706,9 @@ export class ActiveWorkflowRunner {
let workflowInstance: Workflow;
try {
if (workflowData === undefined) {
workflowData = (await Db.collections.Workflow!.findOne(workflowId)) as IWorkflowDb; workflowData = (await Db.collections.Workflow!.findOne(workflowId, {
relations: ['shared', 'shared.user', 'shared.user.globalRole'],
})) as IWorkflowDb;
}
if (!workflowData) {
@ -710,7 +737,9 @@ export class ActiveWorkflowRunner {
}
const mode = 'trigger';
const additionalData = await WorkflowExecuteAdditionalData.getBase(); const additionalData = await WorkflowExecuteAdditionalData.getBase(
(workflowData as WorkflowEntity).shared[0].user.id,
);
const getTriggerFunctions = this.getExecuteTriggerFunctions(
workflowData,
additionalData,


@ -35,6 +35,7 @@ import {
Workflow,
WorkflowExecuteMode,
ITaskDataConnections,
LoggerProxy as Logger,
} from 'n8n-workflow';
// eslint-disable-next-line import/no-cycle
@ -44,8 +45,11 @@
Db,
ICredentialsDb,
NodeTypes,
WhereClause,
WorkflowExecuteAdditionalData,
} from '.';
// eslint-disable-next-line import/no-cycle
import { User } from './databases/entities/User';
const mockNodeTypes: INodeTypes = {
nodeTypes: {} as INodeTypeData,
@ -209,32 +213,40 @@ export class CredentialsHelper extends ICredentialsHelper {
/**
* Returns the credentials instance
*
* @param {INodeCredentialsDetails} nodeCredentials id and name to return instance of * @param {INodeCredentialsDetails} nodeCredential id and name to return instance of
* @param {string} type Type of the credentials to return instance of * @param {string} type Type of the credential to return instance of
* @returns {Credentials}
* @memberof CredentialsHelper
*/
async getCredentials(
nodeCredentials: INodeCredentialsDetails, nodeCredential: INodeCredentialsDetails,
type: string,
userId?: string,
): Promise<Credentials> {
if (!nodeCredentials.id) { if (!nodeCredential.id) {
throw new Error(`Credentials "${nodeCredentials.name}" for type "${type}" don't have an ID.`); throw new Error(`Credential "${nodeCredential.name}" of type "${type}" has no ID.`);
}
const credentials = await Db.collections.Credentials?.findOne({ id: nodeCredentials.id, type }); const credential = userId
? await Db.collections
.SharedCredentials!.findOneOrFail({
relations: ['credentials'],
where: { credentials: { id: nodeCredential.id, type }, user: { id: userId } },
})
.then((shared) => shared.credentials)
: await Db.collections.Credentials!.findOneOrFail({ id: nodeCredential.id, type });
if (!credentials) { if (!credential) {
throw new Error(
`Credentials with ID "${nodeCredentials.id}" don't exist for type "${type}".`, `Credential with ID "${nodeCredential.id}" does not exist for type "${type}".`,
);
}
return new Credentials(
{ id: credentials.id.toString(), name: credentials.name }, { id: credential.id.toString(), name: credential.name },
credentials.type, credential.type,
credentials.nodesAccess, credential.nodesAccess,
credentials.data, credential.data,
);
}
@ -504,6 +516,7 @@ export class CredentialsHelper extends ICredentialsHelper {
}
async testCredentials(
user: User,
credentialType: string,
credentialsDecrypted: ICredentialsDecrypted,
nodeToTestWith?: string,
@ -602,7 +615,7 @@ export class CredentialsHelper extends ICredentialsHelper {
},
};
const additionalData = await WorkflowExecuteAdditionalData.getBase(node.parameters); const additionalData = await WorkflowExecuteAdditionalData.getBase(user.id, node.parameters);
const routingNode = new RoutingNode(
workflow,
@ -656,7 +669,7 @@ export class CredentialsHelper extends ICredentialsHelper {
};
}
}
Logger.debug('Credential test failed', error);
return {
status: 'Error',
message: error.message.toString(),
@ -669,3 +682,47 @@
};
}
}
/**
* Build a `where` clause for a `find()` or `findOne()` operation
* in the `shared_workflow` or `shared_credentials` tables.
*/
export function whereClause({
user,
entityType,
entityId = '',
}: {
user: User;
entityType: 'workflow' | 'credentials';
entityId?: string;
}): WhereClause {
const where: WhereClause = entityId ? { [entityType]: { id: entityId } } : {};
if (user.globalRole.name !== 'owner') {
where.user = { id: user.id };
}
return where;
}
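To make the behaviour concrete, a small illustrative example; the user objects are hypothetical stand-ins carrying only the fields `whereClause()` reads, and the role names follow the owner/member convention used in this PR:

const exampleOwner = { id: '1', globalRole: { name: 'owner' } } as unknown as User;
const exampleMember = { id: '2', globalRole: { name: 'member' } } as unknown as User;

whereClause({ user: exampleOwner, entityType: 'workflow', entityId: '42' });
// => { workflow: { id: '42' } } (owners are not restricted)
whereClause({ user: exampleMember, entityType: 'workflow', entityId: '42' });
// => { workflow: { id: '42' }, user: { id: '2' } } (members only match rows shared with them)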
/**
* Get a credential if it has been shared with a user.
*/
export async function getCredentialForUser(
credentialId: string,
user: User,
): Promise<ICredentialsDb | null> {
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
const sharedCredential = await Db.collections.SharedCredentials!.findOne({
relations: ['credentials'],
where: whereClause({
user,
entityType: 'credentials',
entityId: credentialId,
}),
});
if (!sharedCredential) return null;
return sharedCredential.credentials as ICredentialsDb;
}


@ -1,9 +1,19 @@
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
/* eslint-disable @typescript-eslint/restrict-template-expressions */
/* eslint-disable no-case-declarations */
/* eslint-disable @typescript-eslint/naming-convention */
import { UserSettings } from 'n8n-core';
import { ConnectionOptions, createConnection, getRepository, LoggerOptions } from 'typeorm'; import {
Connection,
ConnectionOptions,
createConnection,
EntityManager,
EntityTarget,
getRepository,
LoggerOptions,
Repository,
} from 'typeorm';
import { TlsOptions } from 'tls';
import * as path from 'path';
// eslint-disable-next-line import/no-cycle
@ -24,9 +34,26 @@ export const collections: IDatabaseCollections = {
Workflow: null,
Webhook: null,
Tag: null,
Role: null,
User: null,
SharedCredentials: null,
SharedWorkflow: null,
Settings: null,
};
export async function init(): Promise<IDatabaseCollections> { let connection: Connection;
export async function transaction<T>(fn: (entityManager: EntityManager) => Promise<T>): Promise<T> {
return connection.transaction(fn);
}
export function linkRepository<Entity>(entityClass: EntityTarget<Entity>): Repository<Entity> {
return getRepository(entityClass, connection.name);
}
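A hedged sketch of how the new `transaction()` wrapper can be used, for example to create a workflow together with its ownership row; the helper is hypothetical, and the entity import paths follow the ones used elsewhere in this diff:

import { SharedWorkflow } from './databases/entities/SharedWorkflow';
import { WorkflowEntity } from './databases/entities/WorkflowEntity';
import { User } from './databases/entities/User';
import { Role } from './databases/entities/Role';

// Both saves share one EntityManager, so they either both succeed or both roll back.
export async function saveWorkflowOwnedBy(
	workflow: WorkflowEntity,
	user: User,
	role: Role,
): Promise<SharedWorkflow> {
	return transaction(async (entityManager) => {
		const savedWorkflow = await entityManager.save<WorkflowEntity>(workflow);
		const shared = Object.assign(new SharedWorkflow(), { workflow: savedWorkflow, user, role });
		return entityManager.save<SharedWorkflow>(shared);
	});
}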
export async function init(
testConnectionOptions?: ConnectionOptions,
): Promise<IDatabaseCollections> {
const dbType = (await GenericHelpers.getConfigValue('database.type')) as DatabaseType;
const n8nFolder = UserSettings.getUserN8nFolderPath();
@ -34,74 +61,80 @@ export async function init(): Promise<IDatabaseCollections> {
const entityPrefix = config.get('database.tablePrefix');
if (testConnectionOptions) {
connectionOptions = testConnectionOptions;
} else {
switch (dbType) {
case 'postgresdb':
const sslCa = (await GenericHelpers.getConfigValue('database.postgresdb.ssl.ca')) as string;
const sslCert = (await GenericHelpers.getConfigValue(
'database.postgresdb.ssl.cert',
)) as string;
const sslKey = (await GenericHelpers.getConfigValue(
'database.postgresdb.ssl.key',
)) as string;
const sslRejectUnauthorized = (await GenericHelpers.getConfigValue(
'database.postgresdb.ssl.rejectUnauthorized',
)) as boolean;
let ssl: TlsOptions | undefined;
if (sslCa !== '' || sslCert !== '' || sslKey !== '' || !sslRejectUnauthorized) {
ssl = {
ca: sslCa || undefined,
cert: sslCert || undefined,
key: sslKey || undefined,
rejectUnauthorized: sslRejectUnauthorized,
};
}
connectionOptions = {
type: 'postgres',
entityPrefix,
database: (await GenericHelpers.getConfigValue('database.postgresdb.database')) as string,
host: (await GenericHelpers.getConfigValue('database.postgresdb.host')) as string,
password: (await GenericHelpers.getConfigValue('database.postgresdb.password')) as string,
port: (await GenericHelpers.getConfigValue('database.postgresdb.port')) as number,
username: (await GenericHelpers.getConfigValue('database.postgresdb.user')) as string,
schema: config.get('database.postgresdb.schema'),
migrations: postgresMigrations,
migrationsRun: true,
migrationsTableName: `${entityPrefix}migrations`,
ssl,
};
break;
case 'mariadb':
case 'mysqldb':
connectionOptions = {
type: dbType === 'mysqldb' ? 'mysql' : 'mariadb',
database: (await GenericHelpers.getConfigValue('database.mysqldb.database')) as string,
entityPrefix,
host: (await GenericHelpers.getConfigValue('database.mysqldb.host')) as string,
password: (await GenericHelpers.getConfigValue('database.mysqldb.password')) as string,
port: (await GenericHelpers.getConfigValue('database.mysqldb.port')) as number,
username: (await GenericHelpers.getConfigValue('database.mysqldb.user')) as string,
migrations: mysqlMigrations,
migrationsRun: true,
migrationsTableName: `${entityPrefix}migrations`,
timezone: 'Z', // set UTC as default
};
break;
case 'sqlite':
connectionOptions = {
type: 'sqlite',
database: path.join(n8nFolder, 'database.sqlite'),
entityPrefix,
migrations: sqliteMigrations,
migrationsRun: false, // migrations for sqlite will be ran manually for now; see below
migrationsTableName: `${entityPrefix}migrations`,
};
break;
default:
throw new Error(`The database "${dbType}" is currently not supported!`);
}
}
let loggingOption: LoggerOptions = (await GenericHelpers.getConfigValue(
@ -129,9 +162,9 @@ export async function init(): Promise<IDatabaseCollections> {
)) as string,
});
let connection = await createConnection(connectionOptions); connection = await createConnection(connectionOptions);
if (dbType === 'sqlite') { if (!testConnectionOptions && dbType === 'sqlite') {
// This specific migration changes database metadata.
// A field is now nullable. We need to reconnect so that
// n8n knows it has changed. Happens only on sqlite.
@ -157,11 +190,17 @@ export async function init(): Promise<IDatabaseCollections> {
}
}
collections.Credentials = getRepository(entities.CredentialsEntity); collections.Credentials = linkRepository(entities.CredentialsEntity);
collections.Execution = getRepository(entities.ExecutionEntity); collections.Execution = linkRepository(entities.ExecutionEntity);
collections.Workflow = getRepository(entities.WorkflowEntity); collections.Workflow = linkRepository(entities.WorkflowEntity);
collections.Webhook = getRepository(entities.WebhookEntity); collections.Webhook = linkRepository(entities.WebhookEntity);
collections.Tag = getRepository(entities.TagEntity); collections.Tag = linkRepository(entities.TagEntity);
collections.Role = linkRepository(entities.Role);
collections.User = linkRepository(entities.User);
collections.SharedCredentials = linkRepository(entities.SharedCredentials);
collections.SharedWorkflow = linkRepository(entities.SharedWorkflow);
collections.Settings = linkRepository(entities.Settings);
return collections;
}


@ -1,3 +1,4 @@
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable @typescript-eslint/restrict-template-expressions */
/* eslint-disable @typescript-eslint/no-unsafe-return */
@ -8,14 +9,18 @@ import * as express from 'express';
import { join as pathJoin } from 'path';
import { readFile as fsReadFile } from 'fs/promises';
import { IDataObject } from 'n8n-workflow';
import { validate } from 'class-validator';
import * as config from '../config';
// eslint-disable-next-line import/no-cycle
import { Db, ICredentialsDb, IPackageVersions } from '.'; import { Db, ICredentialsDb, IPackageVersions, ResponseHelper } from '.';
// eslint-disable-next-line import/order
import { Like } from 'typeorm';
// eslint-disable-next-line import/no-cycle
import { WorkflowEntity } from './databases/entities/WorkflowEntity';
import { CredentialsEntity } from './databases/entities/CredentialsEntity';
import { TagEntity } from './databases/entities/TagEntity';
import { User } from './databases/entities/User';
let versionCache: IPackageVersions | undefined;
@ -188,3 +193,23 @@ export async function generateUniqueName(
return { name: `${requestedName} ${maxSuffix + 1}` };
}
export async function validateEntity(
entity: WorkflowEntity | CredentialsEntity | TagEntity | User,
): Promise<void> {
const errors = await validate(entity);
const errorMessages = errors
.reduce<string[]>((acc, cur) => {
if (!cur.constraints) return acc;
acc.push(...Object.values(cur.constraints));
return acc;
}, [])
.join(' | ');
if (errorMessages) {
throw new ResponseHelper.ResponseError(errorMessages, undefined, 400);
}
}
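A brief hypothetical use of `validateEntity()`, assuming the entity classes carry class-validator decorators (for example an email constraint on `User`), which is what produces the constraint messages joined above:

async function saveUserOrThrow(user: User): Promise<User> {
	await validateEntity(user); // rejects with a 400 ResponseError such as "email must be an email"
	// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
	return Db.collections.User!.save(user);
}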
export const DEFAULT_EXECUTIONS_GET_ALL_LIMIT = 20;


@ -29,6 +29,11 @@ import { Url } from 'url';
import { Request } from 'express';
import { WorkflowEntity } from './databases/entities/WorkflowEntity';
import { TagEntity } from './databases/entities/TagEntity';
import { Role } from './databases/entities/Role';
import { User } from './databases/entities/User';
import { SharedCredentials } from './databases/entities/SharedCredentials';
import { SharedWorkflow } from './databases/entities/SharedWorkflow';
import { Settings } from './databases/entities/Settings';
export interface IActivationError {
time: number;
@ -72,6 +77,11 @@ export interface IDatabaseCollections {
Workflow: Repository<WorkflowEntity> | null;
Webhook: Repository<IWebhookDb> | null;
Tag: Repository<TagEntity> | null;
Role: Repository<Role> | null;
User: Repository<User> | null;
SharedCredentials: Repository<SharedCredentials> | null;
SharedWorkflow: Repository<SharedWorkflow> | null;
Settings: Repository<Settings> | null;
}
export interface IWebhookDb {
@ -83,6 +93,16 @@ export interface IWebhookDb {
pathLength?: number;
}
// ----------------------------------
// settings
// ----------------------------------
export interface ISettingsDb {
key: string;
value: string | boolean | IDataObject | number;
loadOnStartup: boolean;
}
// ----------------------------------
// tags
// ----------------------------------
@ -313,6 +333,16 @@ export interface IDiagnosticInfo {
};
deploymentType: string;
binaryDataMode: string;
n8n_multi_user_allowed: boolean;
smtp_set_up: boolean;
}
export interface ITelemetryUserDeletionData {
user_id: string;
target_user_old_status: 'active' | 'invited';
migration_strategy?: 'transfer_data' | 'delete_data';
target_user_id?: string;
migration_user_id?: string;
}
export interface IInternalHooksClass {
@ -321,15 +351,29 @@ export interface IInternalHooksClass {
diagnosticInfo: IDiagnosticInfo,
firstWorkflowCreatedAt?: Date,
): Promise<unknown[]>;
onPersonalizationSurveySubmitted(answers: IPersonalizationSurveyAnswers): Promise<void>; onPersonalizationSurveySubmitted(userId: string, answers: Record<string, string>): Promise<void>;
onWorkflowCreated(workflow: IWorkflowBase): Promise<void>; onWorkflowCreated(userId: string, workflow: IWorkflowBase): Promise<void>;
onWorkflowDeleted(workflowId: string): Promise<void>; onWorkflowDeleted(userId: string, workflowId: string): Promise<void>;
onWorkflowSaved(workflow: IWorkflowBase): Promise<void>; onWorkflowSaved(userId: string, workflow: IWorkflowBase): Promise<void>;
onWorkflowPostExecute(
executionId: string,
workflow: IWorkflowBase,
runData?: IRun,
userId?: string,
): Promise<void>;
onUserDeletion(userId: string, userDeletionData: ITelemetryUserDeletionData): Promise<void>;
onUserInvite(userInviteData: { user_id: string; target_user_id: string[] }): Promise<void>;
onUserReinvite(userReinviteData: { user_id: string; target_user_id: string }): Promise<void>;
onUserUpdate(userUpdateData: { user_id: string; fields_changed: string[] }): Promise<void>;
onUserInviteEmailClick(userInviteClickData: { user_id: string }): Promise<void>;
onUserPasswordResetEmailClick(userPasswordResetData: { user_id: string }): Promise<void>;
onUserTransactionalEmail(userTransactionalEmailData: {
user_id: string;
message_type: 'Reset password' | 'New user invite' | 'Resend invite';
}): Promise<void>;
onUserPasswordResetRequestClick(userPasswordResetData: { user_id: string }): Promise<void>;
onInstanceOwnerSetup(instanceOwnerSetupData: { user_id: string }): Promise<void>;
onUserSignup(userSignupData: { user_id: string }): Promise<void>;
}
export interface IN8nConfig {
@ -402,6 +446,7 @@ export interface IN8nUISettings {
};
timezone: string;
urlBaseWebhook: string;
urlBaseEditor: string;
versionCli: string;
n8nMetadata?: {
[key: string]: string | number | undefined;
@ -409,8 +454,10 @@ export interface IN8nUISettings {
versionNotifications: IVersionNotificationSettings;
instanceId: string;
telemetry: ITelemetrySettings;
personalizationSurvey: IPersonalizationSurvey; personalizationSurveyEnabled: boolean;
defaultLocale: string; defaultLocale: string;
userManagement: IUserManagementSettings;
workflowTagsDisabled: boolean;
logLevel: 'info' | 'debug' | 'warn' | 'error' | 'verbose';
hiringBannerEnabled: boolean;
templates: {
@ -428,9 +475,10 @@ export interface IPersonalizationSurveyAnswers {
workArea: string[] | string | null;
}
export interface IPersonalizationSurvey { export interface IUserManagementSettings {
answers?: IPersonalizationSurveyAnswers; enabled: boolean;
shouldShow: boolean; showSetupOnFirstLoad?: boolean;
smtpSetup: boolean;
}
export interface IPackageVersions {
@ -556,6 +604,7 @@ export interface IWorkflowExecutionDataProcess {
sessionId?: string;
startNodes?: string[];
workflowData: IWorkflowBase;
userId: string;
}
export interface IWorkflowExecutionDataProcessWithExecution extends IWorkflowExecutionDataProcess {
@ -563,6 +612,7 @@ export interface IWorkflowExecutionDataProcessWithExe
credentialsTypeData: ICredentialsTypeData;
executionId: string;
nodeTypeData: ITransferNodeTypes;
userId: string;
}
export interface IWorkflowExecuteProcess {
@ -570,3 +620,5 @@ export interface IWorkflowExecuteProcess {
workflow: Workflow;
workflowExecute: WorkflowExecute;
}
export type WhereClause = Record<string, { id: string }>;


@ -1,10 +1,11 @@
/* eslint-disable import/no-cycle */
import { BinaryDataManager } from 'n8n-core';
import { IDataObject, INodeTypes, IRun, TelemetryHelpers } from 'n8n-workflow';
import { snakeCase } from 'change-case';
import {
IDiagnosticInfo,
IInternalHooksClass,
IPersonalizationSurveyAnswers, ITelemetryUserDeletionData,
IWorkflowBase,
IWorkflowDb,
} from '.';
@ -34,6 +35,8 @@ export class InternalHooksClass implements IInternalHooksClass {
execution_variables: diagnosticInfo.executionVariables,
n8n_deployment_type: diagnosticInfo.deploymentType,
n8n_binary_data_mode: diagnosticInfo.binaryDataMode,
n8n_multi_user_allowed: diagnosticInfo.n8n_multi_user_allowed,
smtp_set_up: diagnosticInfo.smtp_set_up,
};
return Promise.all([
@ -45,41 +48,49 @@ export class InternalHooksClass implements IInternalHooksClass {
]);
}
async onPersonalizationSurveySubmitted(answers: IPersonalizationSurveyAnswers): Promise<void> { async onPersonalizationSurveySubmitted(
return this.telemetry.track('User responded to personalization questions', { userId: string,
company_size: answers.companySize, answers: Record<string, string>,
coding_skill: answers.codingSkill, ): Promise<void> {
work_area: answers.workArea, const camelCaseKeys = Object.keys(answers);
other_work_area: answers.otherWorkArea, const personalizationSurveyData = { user_id: userId } as Record<string, string | string[]>;
company_industry: answers.companyIndustry, camelCaseKeys.forEach((camelCaseKey) => {
other_company_industry: answers.otherCompanyIndustry, personalizationSurveyData[snakeCase(camelCaseKey)] = answers[camelCaseKey];
}); });
return this.telemetry.track(
'User responded to personalization questions',
personalizationSurveyData,
);
}
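An illustrative input/output pair for the key conversion above (the answer values are made up):

import { snakeCase } from 'change-case';

const answers: Record<string, string> = { companySize: '20-99', codingSkill: '4' };
const payload: Record<string, string | string[]> = { user_id: 'user-123' };
Object.keys(answers).forEach((key) => {
	payload[snakeCase(key)] = answers[key];
});
// payload => { user_id: 'user-123', company_size: '20-99', coding_skill: '4' }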
async onWorkflowCreated(workflow: IWorkflowBase): Promise<void> { async onWorkflowCreated(userId: string, workflow: IWorkflowBase): Promise<void> {
const { nodeGraph } = TelemetryHelpers.generateNodesGraph(workflow, this.nodeTypes);
return this.telemetry.track('User created workflow', {
user_id: userId,
workflow_id: workflow.id,
node_graph: nodeGraph,
node_graph_string: JSON.stringify(nodeGraph),
});
}
async onWorkflowDeleted(workflowId: string): Promise<void> { async onWorkflowDeleted(userId: string, workflowId: string): Promise<void> {
return this.telemetry.track('User deleted workflow', {
user_id: userId,
workflow_id: workflowId,
});
}
async onWorkflowSaved(workflow: IWorkflowDb): Promise<void> { async onWorkflowSaved(userId: string, workflow: IWorkflowDb): Promise<void> {
const { nodeGraph } = TelemetryHelpers.generateNodesGraph(workflow, this.nodeTypes);
return this.telemetry.track('User saved workflow', {
user_id: userId,
workflow_id: workflow.id,
node_graph: nodeGraph,
node_graph_string: JSON.stringify(nodeGraph),
version_cli: this.versionCli,
num_tags: workflow.tags.length, num_tags: workflow.tags?.length ?? 0,
});
}
@ -87,6 +98,7 @@ export class InternalHooksClass implements IInternalHooksClass {
executionId: string,
workflow: IWorkflowBase,
runData?: IRun,
userId?: string,
): Promise<void> {
const promises = [Promise.resolve()];
const properties: IDataObject = {
@ -95,6 +107,10 @@ export class InternalHooksClass implements IInternalHooksClass {
version_cli: this.versionCli,
};
if (userId) {
properties.user_id = userId;
}
if (runData !== undefined) {
properties.execution_mode = runData.mode;
properties.success = !!runData.finished;
@ -188,4 +204,72 @@ export class InternalHooksClass implements IInternalHooksClass {
return Promise.race([timeoutPromise, this.telemetry.trackN8nStop()]);
}
async onUserDeletion(
userId: string,
userDeletionData: ITelemetryUserDeletionData,
): Promise<void> {
return this.telemetry.track('User deleted user', { ...userDeletionData, user_id: userId });
}
async onUserInvite(userInviteData: { user_id: string; target_user_id: string[] }): Promise<void> {
return this.telemetry.track('User invited new user', userInviteData);
}
async onUserReinvite(userReinviteData: {
user_id: string;
target_user_id: string;
}): Promise<void> {
return this.telemetry.track('User resent new user invite email', userReinviteData);
}
async onUserUpdate(userUpdateData: { user_id: string; fields_changed: string[] }): Promise<void> {
return this.telemetry.track('User changed personal settings', userUpdateData);
}
async onUserInviteEmailClick(userInviteClickData: { user_id: string }): Promise<void> {
return this.telemetry.track('User clicked invite link from email', userInviteClickData);
}
async onUserPasswordResetEmailClick(userPasswordResetData: { user_id: string }): Promise<void> {
return this.telemetry.track(
'User clicked password reset link from email',
userPasswordResetData,
);
}
async onUserTransactionalEmail(userTransactionalEmailData: {
user_id: string;
message_type: 'Reset password' | 'New user invite' | 'Resend invite';
}): Promise<void> {
return this.telemetry.track(
'Instance sent transactional email to user',
userTransactionalEmailData,
);
}
async onUserPasswordResetRequestClick(userPasswordResetData: { user_id: string }): Promise<void> {
return this.telemetry.track(
'User requested password reset while logged out',
userPasswordResetData,
);
}
async onInstanceOwnerSetup(instanceOwnerSetupData: { user_id: string }): Promise<void> {
return this.telemetry.track('Owner finished instance setup', instanceOwnerSetupData);
}
async onUserSignup(userSignupData: { user_id: string }): Promise<void> {
return this.telemetry.track('User signed up', userSignupData);
}
async onEmailFailed(failedEmailData: {
user_id: string;
message_type: 'Reset password' | 'New user invite' | 'Resend invite';
}): Promise<void> {
return this.telemetry.track(
'Instance failed to send transactional email to user',
failedEmailData,
);
}
}


@ -1,63 +0,0 @@
import { readFileSync, writeFile } from 'fs';
import { promisify } from 'util';
import { UserSettings } from 'n8n-core';
import * as config from '../config';
// eslint-disable-next-line import/no-cycle
import { Db, IPersonalizationSurvey, IPersonalizationSurveyAnswers } from '.';
const fsWriteFile = promisify(writeFile);
const PERSONALIZATION_SURVEY_FILENAME = 'personalizationSurvey.json';
function loadSurveyFromDisk(): IPersonalizationSurveyAnswers | undefined {
const userSettingsPath = UserSettings.getUserN8nFolderPath();
try {
const surveyFile = readFileSync(
`${userSettingsPath}/${PERSONALIZATION_SURVEY_FILENAME}`,
'utf-8',
);
return JSON.parse(surveyFile) as IPersonalizationSurveyAnswers;
} catch (error) {
return undefined;
}
}
export async function writeSurveyToDisk(
surveyAnswers: IPersonalizationSurveyAnswers,
): Promise<void> {
const userSettingsPath = UserSettings.getUserN8nFolderPath();
await fsWriteFile(
`${userSettingsPath}/${PERSONALIZATION_SURVEY_FILENAME}`,
JSON.stringify(surveyAnswers, null, '\t'),
);
}
export async function preparePersonalizationSurvey(): Promise<IPersonalizationSurvey> {
const survey: IPersonalizationSurvey = {
shouldShow: false,
};
survey.answers = loadSurveyFromDisk();
if (survey.answers) {
return survey;
}
const enabled =
(config.get('personalization.enabled') as boolean) &&
(config.get('diagnostics.enabled') as boolean);
if (!enabled) {
return survey;
}
const workflowsExist = !!(await Db.collections.Workflow?.findOne());
if (workflowsExist) {
return survey;
}
survey.shouldShow = true;
return survey;
}


@ -1,3 +1,4 @@
+ /* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-explicit-any */
/* eslint-disable no-console */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
@ -101,6 +102,8 @@ export function sendErrorResponse(res: Response, error: ResponseError, shouldLog
httpStatusCode = error.httpStatusCode;
}
+ shouldLog = !process.argv[1].split('/').includes('jest');
if (process.env.NODE_ENV !== 'production' && shouldLog) {
console.error('ERROR RESPONSE');
console.error(error);
@ -133,6 +136,9 @@ export function sendErrorResponse(res: Response, error: ResponseError, shouldLog
res.status(httpStatusCode).json(response);
}
+ const isUniqueConstraintError = (error: Error) =>
+ ['unique', 'duplicate'].some((s) => error.message.toLowerCase().includes(s));
/**
* A helper function which does not just allow to return Promises it also makes sure that
* all the responses have the same format
@ -148,10 +154,12 @@ export function send(processFunction: (req: Request, res: Response) => Promise<a
try {
const data = await processFunction(req, res);
- // Success response
sendSuccessResponse(res, data);
} catch (error) {
- // Error response
+ if (error instanceof Error && isUniqueConstraintError(error)) {
+ error.message = 'There is already an entry with this name';
+ }
sendErrorResponse(res, error);
}
};



@ -2,9 +2,6 @@
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
/* eslint-disable import/no-cycle */
import { getConnection } from 'typeorm';
- import { validate } from 'class-validator';
- import { ResponseHelper } from '.';
import { TagEntity } from './databases/entities/TagEntity';
@ -15,43 +12,18 @@ import { ITagWithCountDb } from './Interfaces';
// ----------------------------------
/**
- * Sort a `TagEntity[]` by the order of the tag IDs in the incoming request.
+ * Sort tags based on the order of the tag IDs in the request.
*/
- export function sortByRequestOrder(tagsDb: TagEntity[], tagIds: string[]) {
- const tagMap = tagsDb.reduce((acc, tag) => {
- // @ts-ignore
- tag.id = tag.id.toString();
- acc[tag.id] = tag;
- return acc;
- }, {} as { [key: string]: TagEntity });
- return tagIds.map((tagId) => tagMap[tagId]);
- }
+ export function sortByRequestOrder(
+ tags: TagEntity[],
+ { requestOrder }: { requestOrder: string[] },
+ ) {
+ const tagMap = tags.reduce<Record<string, TagEntity>>((acc, tag) => {
+ acc[tag.id.toString()] = tag;
+ return acc;
+ }, {});
+ return requestOrder.map((tagId) => tagMap[tagId]);
+ }
- // ----------------------------------
- // validators
- // ----------------------------------
- /**
- * Validate a new tag based on `class-validator` constraints.
- */
- export async function validateTag(newTag: TagEntity) {
- const errors = await validate(newTag);
- if (errors.length) {
- // eslint-disable-next-line @typescript-eslint/no-non-null-assertion
- const validationErrorMessage = Object.values(errors[0].constraints!)[0];
- throw new ResponseHelper.ResponseError(validationErrorMessage, undefined, 400);
- }
- }
- export function throwDuplicateEntryError(error: Error) {
- const errorMessage = error.message.toLowerCase();
- if (errorMessage.includes('unique') || errorMessage.includes('duplicate')) {
- throw new ResponseHelper.ResponseError('Tag name already exists', undefined, 400);
- }
- throw new ResponseHelper.ResponseError(errorMessage, undefined, 400);
- }
// ----------------------------------


@ -0,0 +1,40 @@
/* eslint-disable import/no-cycle */
import { Application } from 'express';
import { JwtFromRequestFunction } from 'passport-jwt';
import type { IExternalHooksClass, IPersonalizationSurveyAnswers } from '../Interfaces';
import { ActiveWorkflowRunner } from '..';
export interface JwtToken {
token: string;
expiresIn: number;
}
export interface JwtOptions {
secretOrKey: string;
jwtFromRequest: JwtFromRequestFunction;
}
export interface JwtPayload {
id: string;
email: string | null;
password: string | null;
}
export interface PublicUser {
id: string;
email?: string;
firstName?: string;
lastName?: string;
personalizationAnswers?: IPersonalizationSurveyAnswers | null;
password?: string;
passwordResetToken?: string;
isPending: boolean;
}
export interface N8nApp {
app: Application;
restEndpoint: string;
externalHooks: IExternalHooksClass;
defaultCredentialsName: string;
activeWorkflowRunner: ActiveWorkflowRunner.ActiveWorkflowRunner;
}


@ -0,0 +1,218 @@
/* eslint-disable @typescript-eslint/no-unused-vars */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable import/no-cycle */
import { Workflow } from 'n8n-workflow';
import { In, IsNull, Not } from 'typeorm';
import express = require('express');
import { PublicUser } from './Interfaces';
import { Db, GenericHelpers, ResponseHelper } from '..';
import { MAX_PASSWORD_LENGTH, MIN_PASSWORD_LENGTH, User } from '../databases/entities/User';
import { Role } from '../databases/entities/Role';
import { AuthenticatedRequest } from '../requests';
import config = require('../../config');
import { getWebhookBaseUrl } from '../WebhookHelpers';
export async function getWorkflowOwner(workflowId: string | number): Promise<User> {
const sharedWorkflow = await Db.collections.SharedWorkflow!.findOneOrFail({
where: { workflow: { id: workflowId } },
relations: ['user', 'user.globalRole'],
});
return sharedWorkflow.user;
}
export function isEmailSetUp(): boolean {
const smtp = config.get('userManagement.emails.mode') === 'smtp';
const host = !!config.get('userManagement.emails.smtp.host');
const user = !!config.get('userManagement.emails.smtp.auth.user');
const pass = !!config.get('userManagement.emails.smtp.auth.pass');
return smtp && host && user && pass;
}
async function getInstanceOwnerRole(): Promise<Role> {
const ownerRole = await Db.collections.Role!.findOneOrFail({
where: {
name: 'owner',
scope: 'global',
},
});
return ownerRole;
}
export async function getInstanceOwner(): Promise<User> {
const ownerRole = await getInstanceOwnerRole();
const owner = await Db.collections.User!.findOneOrFail({
relations: ['globalRole'],
where: {
globalRole: ownerRole,
},
});
return owner;
}
/**
* Return the n8n instance base URL without trailing slash.
*/
export function getInstanceBaseUrl(): string {
const n8nBaseUrl = config.get('editorBaseUrl') || getWebhookBaseUrl();
return n8nBaseUrl.endsWith('/') ? n8nBaseUrl.slice(0, n8nBaseUrl.length - 1) : n8nBaseUrl;
}
export async function isInstanceOwnerSetup(): Promise<boolean> {
const users = await Db.collections.User!.find({ email: Not(IsNull()) });
return users.length !== 0;
}
// TODO: Enforce at model level
export function validatePassword(password?: string): string {
if (!password) {
throw new ResponseHelper.ResponseError('Password is mandatory', undefined, 400);
}
const hasInvalidLength =
password.length < MIN_PASSWORD_LENGTH || password.length > MAX_PASSWORD_LENGTH;
const hasNoNumber = !/\d/.test(password);
const hasNoUppercase = !/[A-Z]/.test(password);
if (hasInvalidLength || hasNoNumber || hasNoUppercase) {
const message: string[] = [];
if (hasInvalidLength) {
message.push(
`Password must be ${MIN_PASSWORD_LENGTH} to ${MAX_PASSWORD_LENGTH} characters long.`,
);
}
if (hasNoNumber) {
message.push('Password must contain at least 1 number.');
}
if (hasNoUppercase) {
message.push('Password must contain at least 1 uppercase letter.');
}
throw new ResponseHelper.ResponseError(message.join(' '), undefined, 400);
}
return password;
}
/**
* Remove sensitive properties from the user to return to the client.
*/
export function sanitizeUser(user: User, withoutKeys?: string[]): PublicUser {
const {
password,
resetPasswordToken,
resetPasswordTokenExpiration,
createdAt,
updatedAt,
...sanitizedUser
} = user;
if (withoutKeys) {
withoutKeys.forEach((key) => {
// @ts-ignore
delete sanitizedUser[key];
});
}
return sanitizedUser;
}
export async function getUserById(userId: string): Promise<User> {
const user = await Db.collections.User!.findOneOrFail(userId, {
relations: ['globalRole'],
});
return user;
}
export async function checkPermissionsForExecution(
workflow: Workflow,
userId: string,
): Promise<boolean> {
const credentialIds = new Set();
const nodeNames = Object.keys(workflow.nodes);
// Iterate over all nodes
nodeNames.forEach((nodeName) => {
const node = workflow.nodes[nodeName];
// And check if any of the nodes uses credentials.
if (node.credentials) {
const credentialNames = Object.keys(node.credentials);
// For every credential this node uses
credentialNames.forEach((credentialName) => {
const credentialDetail = node.credentials![credentialName];
// If it does not contain an id, it means it is a very old
// workflow. Nowadays this should no longer happen.
// Migrations should handle the case where a credential does
// not have an id.
if (!credentialDetail.id) {
throw new Error(
'Error initializing workflow: credential ID not present. Please open the workflow and save it to fix this error.',
);
}
credentialIds.add(credentialDetail.id.toString());
});
}
});
// Now that we have obtained all credential IDs used by this workflow,
// check whether the given user has access to all of them.
const ids = Array.from(credentialIds);
if (ids.length === 0) {
// If the workflow does not use any credentials, then we're fine
return true;
}
// If this check happened at the top, we might get
// uninitialized DB errors.
// The DB is certainly initialized if the workflow uses credentials.
const user = await getUserById(userId);
if (user.globalRole.name === 'owner') {
return true;
}
// Check for the user's permission to all used credentials
const credentialCount = await Db.collections.SharedCredentials!.count({
where: {
user: { id: userId },
credentials: In(ids),
},
});
// Considering the user needs to have access to all credentials
// then both arrays (allowed credentials vs used credentials)
// must be the same length
if (ids.length !== credentialCount) {
throw new Error('One or more of the used credentials are not accessible.');
}
return true;
}
/**
* Check if a URL contains an auth-excluded endpoint.
*/
export function isAuthExcluded(url: string, ignoredEndpoints: string[]): boolean {
return !!ignoredEndpoints
.filter(Boolean) // skip empty paths
.find((ignoredEndpoint) => url.includes(ignoredEndpoint));
}
/**
* Check if the endpoint is `POST /users/:id`.
*/
export function isPostUsersId(req: express.Request, restEndpoint: string): boolean {
return (
req.method === 'POST' &&
new RegExp(`/${restEndpoint}/users/[\\w\\d-]*`).test(req.url) &&
!req.url.includes('reinvite')
);
}
export function isAuthenticatedRequest(request: express.Request): request is AuthenticatedRequest {
return request.user !== undefined;
}
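
A brief usage sketch for the helpers above; the relative import paths are assumptions based on this PR's `UserManagement` folder layout.

import { validatePassword, sanitizeUser, isEmailSetUp } from './UserManagementHelper';
import { User } from '../databases/entities/User';

function checkSignupPassword(password?: string): string {
	// Throws a 400 ResponseError when the password is missing, has an invalid
	// length, or lacks a digit or an uppercase letter; returns it otherwise.
	return validatePassword(password);
}

function toPublicUser(user: User) {
	// Strips password, reset-token and timestamp fields; additionally drops
	// personalizationAnswers, as GET /users does below.
	return sanitizeUser(user, ['personalizationAnswers']);
}

function canSendInvites(): boolean {
	// True only when SMTP mode plus host, user and pass are all configured.
	return isEmailSetUp();
}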


@ -0,0 +1,67 @@
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable import/no-cycle */
import * as jwt from 'jsonwebtoken';
import { Response } from 'express';
import { createHash } from 'crypto';
import { Db } from '../..';
import { AUTH_COOKIE_NAME } from '../../constants';
import { JwtToken, JwtPayload } from '../Interfaces';
import { User } from '../../databases/entities/User';
import config = require('../../../config');
export function issueJWT(user: User): JwtToken {
const { id, email, password } = user;
const expiresIn = 7 * 86400000; // 7 days
const payload: JwtPayload = {
id,
email,
password: password ?? null,
};
if (password) {
payload.password = createHash('sha256')
.update(password.slice(password.length / 2))
.digest('hex');
}
const signedToken = jwt.sign(payload, config.get('userManagement.jwtSecret'), {
expiresIn: expiresIn / 1000 /* in seconds */,
});
return {
token: signedToken,
expiresIn,
};
}
export async function resolveJwtContent(jwtPayload: JwtPayload): Promise<User> {
const user = await Db.collections.User!.findOne(jwtPayload.id, {
relations: ['globalRole'],
});
let passwordHash = null;
if (user?.password) {
passwordHash = createHash('sha256')
.update(user.password.slice(user.password.length / 2))
.digest('hex');
}
if (!user || jwtPayload.password !== passwordHash || user.email !== jwtPayload.email) {
// When the owner hasn't been set up, the default user
// won't have an email or password (both are null).
throw new Error('Invalid token content');
}
return user;
}
export async function resolveJwt(token: string): Promise<User> {
const jwtPayload = jwt.verify(token, config.get('userManagement.jwtSecret')) as JwtPayload;
return resolveJwtContent(jwtPayload);
}
export async function issueCookie(res: Response, user: User): Promise<void> {
const userData = issueJWT(user);
res.cookie(AUTH_COOKIE_NAME, userData.token, { maxAge: userData.expiresIn, httpOnly: true });
}
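
For context, a sketch of how these functions combine into the cookie-based session used by the routes below; the import paths are assumptions.

import { Response } from 'express';
import { issueCookie, resolveJwt } from './auth/jwt';
import { AUTH_COOKIE_NAME } from '../constants';
import { User } from '../databases/entities/User';

async function startSession(res: Response, user: User): Promise<void> {
	// Signs a JWT carrying the user id, email and a sha256 digest of part of
	// the stored password hash, then sets it as an httpOnly cookie for 7 days.
	await issueCookie(res, user);
}

async function userFromCookie(cookies: Record<string, string | undefined>): Promise<User> {
	const token = cookies[AUTH_COOKIE_NAME];
	if (!token) throw new Error('Not logged in');
	// Verifies the signature and rejects the token if the user's email or
	// password changed since it was issued.
	return resolveJwt(token);
}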


@ -0,0 +1,32 @@
export interface UserManagementMailerImplementation {
sendMail: (mailData: MailData) => Promise<SendEmailResult>;
verifyConnection: () => Promise<void>;
}
export type InviteEmailData = {
email: string;
firstName?: string;
lastName?: string;
inviteAcceptUrl: string;
domain: string;
};
export type PasswordResetData = {
email: string;
firstName?: string;
lastName?: string;
passwordResetUrl: string;
domain: string;
};
export type SendEmailResult = {
success: boolean;
error?: Error;
};
export type MailData = {
body: string | Buffer;
emailRecipients: string | string[];
subject: string;
textOnly?: string;
};


@ -0,0 +1,74 @@
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
import { createTransport, Transporter } from 'nodemailer';
import { LoggerProxy as Logger } from 'n8n-workflow';
import config = require('../../../config');
import { MailData, SendEmailResult, UserManagementMailerImplementation } from './Interfaces';
export class NodeMailer implements UserManagementMailerImplementation {
private transport: Transporter;
constructor() {
this.transport = createTransport({
host: config.get('userManagement.emails.smtp.host'),
port: config.get('userManagement.emails.smtp.port'),
secure: config.get('userManagement.emails.smtp.secure'),
auth: {
user: config.get('userManagement.emails.smtp.auth.user'),
pass: config.get('userManagement.emails.smtp.auth.pass'),
},
});
}
async verifyConnection(): Promise<void> {
const host = config.get('userManagement.emails.smtp.host') as string;
const user = config.get('userManagement.emails.smtp.auth.user') as string;
const pass = config.get('userManagement.emails.smtp.auth.pass') as string;
return new Promise((resolve, reject) => {
this.transport.verify((error: Error) => {
if (!error) {
resolve();
return;
}
const message = [];
if (!host) message.push('SMTP host not defined (N8N_SMTP_HOST).');
if (!user) message.push('SMTP user not defined (N8N_SMTP_USER).');
if (!pass) message.push('SMTP pass not defined (N8N_SMTP_PASS).');
reject(new Error(message.length ? message.join(' ') : error.message));
});
});
}
async sendMail(mailData: MailData): Promise<SendEmailResult> {
let sender = config.get('userManagement.emails.smtp.sender');
const user = config.get('userManagement.emails.smtp.auth.user') as string;
if (!sender && user.includes('@')) {
sender = user;
}
try {
await this.transport.sendMail({
from: sender,
to: mailData.emailRecipients,
subject: mailData.subject,
text: mailData.textOnly,
html: mailData.body,
});
Logger.verbose(
`Email sent successfully to the following recipients: ${mailData.emailRecipients.toString()}`,
);
} catch (error) {
Logger.error('Failed to send email', { recipients: mailData.emailRecipients, error });
return {
success: false,
error,
};
}
return { success: true };
}
}


@ -0,0 +1,98 @@
/* eslint-disable import/no-cycle */
import { existsSync, readFileSync } from 'fs';
import { IDataObject } from 'n8n-workflow';
import { join as pathJoin } from 'path';
import { GenericHelpers } from '../..';
import config = require('../../../config');
import {
InviteEmailData,
PasswordResetData,
SendEmailResult,
UserManagementMailerImplementation,
} from './Interfaces';
import { NodeMailer } from './NodeMailer';
async function getTemplate(configKeyName: string, defaultFilename: string) {
const templateOverride = (await GenericHelpers.getConfigValue(
`userManagement.emails.templates.${configKeyName}`,
)) as string;
let template;
if (templateOverride && existsSync(templateOverride)) {
template = readFileSync(templateOverride, {
encoding: 'utf-8',
});
} else {
template = readFileSync(pathJoin(__dirname, `templates/${defaultFilename}`), {
encoding: 'utf-8',
});
}
return template;
}
function replaceStrings(template: string, data: IDataObject) {
let output = template;
const keys = Object.keys(data);
keys.forEach((key) => {
const regex = new RegExp(`\\{\\{\\s*${key}\\s*\\}\\}`, 'g');
output = output.replace(regex, data[key] as string);
});
return output;
}
export class UserManagementMailer {
private mailer: UserManagementMailerImplementation | undefined;
constructor() {
// Other implementations can be used in the future.
if (config.get('userManagement.emails.mode') === 'smtp') {
this.mailer = new NodeMailer();
}
}
async verifyConnection(): Promise<void> {
if (!this.mailer) return Promise.reject();
return this.mailer.verifyConnection();
}
async invite(inviteEmailData: InviteEmailData): Promise<SendEmailResult> {
let template = await getTemplate('invite', 'invite.html');
template = replaceStrings(template, inviteEmailData);
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
const result = await this.mailer?.sendMail({
emailRecipients: inviteEmailData.email,
subject: 'You have been invited to n8n',
body: template,
});
// If mailer does not exist it means mail has been disabled.
return result ?? { success: true };
}
async passwordReset(passwordResetData: PasswordResetData): Promise<SendEmailResult> {
let template = await getTemplate('passwordReset', 'passwordReset.html');
template = replaceStrings(template, passwordResetData);
// eslint-disable-next-line @typescript-eslint/no-non-null-assertion
const result = await this.mailer?.sendMail({
emailRecipients: passwordResetData.email,
subject: 'n8n password reset',
body: template,
});
// If mailer does not exist it means mail has been disabled.
return result ?? { success: true };
}
}
let mailerInstance: UserManagementMailer | undefined;
export async function getInstance(): Promise<UserManagementMailer> {
if (mailerInstance === undefined) {
mailerInstance = new UserManagementMailer();
await mailerInstance.verifyConnection();
}
return mailerInstance;
}
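
A usage sketch mirroring how the invite route below consumes this module; the `./email` import path is an assumption based on the index file added in this PR.

import * as UserManagementMailer from './email';

async function sendInvite(email: string, inviteAcceptUrl: string, domain: string): Promise<void> {
	// getInstance() lazily constructs the mailer and verifies the SMTP
	// connection; it rejects when no SMTP transport is configured.
	const mailer = await UserManagementMailer.getInstance();
	const result = await mailer.invite({ email, inviteAcceptUrl, domain });
	if (!result.success) {
		// result.error carries the underlying nodemailer error, if any.
		throw result.error ?? new Error(`Email could not be sent to ${email}`);
	}
}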


@ -0,0 +1,4 @@
/* eslint-disable import/no-cycle */
import { getInstance, UserManagementMailer } from './UserManagementMailer';
export { getInstance, UserManagementMailer };


@ -0,0 +1,5 @@
<h1>Hi there!</h1>
<p>Welcome to n8n, {{firstName}} {{lastName}}</p>
<p>Your instance is set up!</p>
<p>Use your email ({{email}}) and your chosen password to log in.</p>
<p>Have fun automating!</p>


@ -0,0 +1,4 @@
<p>Hi there,</p>
<p>You have been invited to join n8n ({{ domain }}).</p>
<p>To accept, click the following link:</p>
<p><a href="{{ inviteAcceptUrl }}" target="_blank">{{ inviteAcceptUrl }}</a></p>


@ -0,0 +1,5 @@
<p>Hi {{firstName}},</p>
<p>Somebody asked to reset your password on n8n ({{ domain }}).</p>
<p>If it was not you, you can safely ignore this email.</p>
<p>Click the following link to choose a new password. The link is valid for 2 hours.</p>
<a href="{{ passwordResetUrl }}">{{ passwordResetUrl }}</a>


@ -0,0 +1,4 @@
/* eslint-disable import/no-cycle */
import { addRoutes } from './routes';
export const userManagementRouter = { addRoutes };


@ -0,0 +1,119 @@
/* eslint-disable @typescript-eslint/naming-convention */
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable @typescript-eslint/no-unsafe-member-access */
import { Request, Response } from 'express';
import { compare } from 'bcryptjs';
import { IDataObject } from 'n8n-workflow';
import { Db, ResponseHelper } from '../..';
import { AUTH_COOKIE_NAME } from '../../constants';
import { issueCookie, resolveJwt } from '../auth/jwt';
import { N8nApp, PublicUser } from '../Interfaces';
import { isInstanceOwnerSetup, sanitizeUser } from '../UserManagementHelper';
import { User } from '../../databases/entities/User';
import type { LoginRequest } from '../../requests';
export function authenticationMethods(this: N8nApp): void {
/**
* Log in a user.
*
* Authless endpoint.
*/
this.app.post(
`/${this.restEndpoint}/login`,
ResponseHelper.send(async (req: LoginRequest, res: Response): Promise<PublicUser> => {
if (!req.body.email) {
throw new Error('Email is required to log in');
}
if (!req.body.password) {
throw new Error('Password is required to log in');
}
let user;
try {
user = await Db.collections.User!.findOne(
{
email: req.body.email,
},
{
relations: ['globalRole'],
},
);
} catch (error) {
throw new Error('Unable to access database.');
}
if (!user || !user.password || !(await compare(req.body.password, user.password))) {
// password is empty until user signs up
const error = new Error('Wrong username or password. Do you have caps lock on?');
// @ts-ignore
error.httpStatusCode = 401;
throw error;
}
await issueCookie(res, user);
return sanitizeUser(user);
}),
);
/**
* Manually check the `n8n-auth` cookie.
*/
this.app.get(
`/${this.restEndpoint}/login`,
ResponseHelper.send(async (req: Request, res: Response): Promise<PublicUser> => {
// Manually check the existing cookie.
const cookieContents = req.cookies?.[AUTH_COOKIE_NAME] as string | undefined;
let user: User;
if (cookieContents) {
// If logged in, return user
try {
user = await resolveJwt(cookieContents);
return sanitizeUser(user);
} catch (error) {
res.clearCookie(AUTH_COOKIE_NAME);
}
}
if (await isInstanceOwnerSetup()) {
const error = new Error('Not logged in');
// @ts-ignore
error.httpStatusCode = 401;
throw error;
}
try {
user = await Db.collections.User!.findOneOrFail({ relations: ['globalRole'] });
} catch (error) {
throw new Error(
'No users found in database - did you wipe the users table? Create at least one user.',
);
}
if (user.email || user.password) {
throw new Error('Invalid database state - default user already has email or password set.');
}
await issueCookie(res, user);
return sanitizeUser(user);
}),
);
/**
* Log out a user.
*
* Authless endpoint.
*/
this.app.post(
`/${this.restEndpoint}/logout`,
ResponseHelper.send(async (_, res: Response): Promise<IDataObject> => {
res.clearCookie(AUTH_COOKIE_NAME);
return {
loggedOut: true,
};
}),
);
}
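
For reference, a client-side sketch of the login flow these endpoints implement. The base URL and the `rest` prefix are assumptions (both are configurable), and a global `fetch` is assumed (browser or Node 18+).

const baseUrl = 'http://localhost:5678/rest'; // assumed prefix and port

async function login(email: string, password: string) {
	// POST /login checks the credentials and sets the httpOnly `n8n-auth` cookie.
	const res = await fetch(`${baseUrl}/login`, {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		credentials: 'include',
		body: JSON.stringify({ email, password }),
	});
	if (!res.ok) throw new Error(`Login failed with status ${res.status}`);
	// The body contains the sanitized user, without password or reset-token fields.
	return res.json();
}

async function logout(): Promise<void> {
	// POST /logout clears the auth cookie.
	await fetch(`${baseUrl}/logout`, { method: 'POST', credentials: 'include' });
}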


@ -0,0 +1,126 @@
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable @typescript-eslint/no-unsafe-call */
/* eslint-disable @typescript-eslint/no-unsafe-return */
/* eslint-disable import/no-cycle */
import cookieParser = require('cookie-parser');
import * as passport from 'passport';
import { Strategy } from 'passport-jwt';
import { NextFunction, Request, Response } from 'express';
import * as jwt from 'jsonwebtoken';
import { LoggerProxy as Logger } from 'n8n-workflow';
import { JwtPayload, N8nApp } from '../Interfaces';
import { authenticationMethods } from './auth';
import config = require('../../../config');
import { AUTH_COOKIE_NAME } from '../../constants';
import { issueCookie, resolveJwtContent } from '../auth/jwt';
import { meNamespace } from './me';
import { usersNamespace } from './users';
import { passwordResetNamespace } from './passwordReset';
import { AuthenticatedRequest } from '../../requests';
import { ownerNamespace } from './owner';
import { isAuthExcluded, isPostUsersId, isAuthenticatedRequest } from '../UserManagementHelper';
export function addRoutes(this: N8nApp, ignoredEndpoints: string[], restEndpoint: string): void {
// needed for testing; adds no overhead since cookieParser returns early if req.cookies already exists
this.app.use(cookieParser());
const options = {
jwtFromRequest: (req: Request) => {
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
return (req.cookies?.[AUTH_COOKIE_NAME] as string | undefined) ?? null;
},
secretOrKey: config.get('userManagement.jwtSecret') as string,
};
passport.use(
new Strategy(options, async function validateCookieContents(jwtPayload: JwtPayload, done) {
try {
const user = await resolveJwtContent(jwtPayload);
return done(null, user);
} catch (error) {
Logger.debug('Failed to extract user from JWT payload', { jwtPayload });
return done(null, false, { message: 'User not found' });
}
}),
);
this.app.use(passport.initialize());
this.app.use((req: Request, res: Response, next: NextFunction) => {
if (
// TODO: refactor me!!!
// skip authentication for preflight requests
req.method === 'OPTIONS' ||
req.url === '/index.html' ||
req.url === '/favicon.ico' ||
req.url.startsWith('/css/') ||
req.url.startsWith('/js/') ||
req.url.startsWith('/fonts/') ||
req.url.includes('.svg') ||
req.url.startsWith(`/${restEndpoint}/settings`) ||
req.url.includes('login') ||
req.url.includes('logout') ||
req.url.startsWith(`/${restEndpoint}/resolve-signup-token`) ||
isPostUsersId(req, restEndpoint) ||
req.url.startsWith(`/${restEndpoint}/forgot-password`) ||
req.url.startsWith(`/${restEndpoint}/resolve-password-token`) ||
req.url.startsWith(`/${restEndpoint}/change-password`) ||
isAuthExcluded(req.url, ignoredEndpoints)
) {
return next();
}
return passport.authenticate('jwt', { session: false })(req, res, next);
});
this.app.use((req: Request | AuthenticatedRequest, res: Response, next: NextFunction) => {
// req.user is empty for public routes, so just proceed
// owner can do anything, so proceed as well
if (!req.user || (isAuthenticatedRequest(req) && req.user.globalRole.name === 'owner')) {
next();
return;
}
// Not owner and user exists. We now protect restricted urls.
const postRestrictedUrls = [`/${this.restEndpoint}/users`, `/${this.restEndpoint}/owner`];
const getRestrictedUrls = [`/${this.restEndpoint}/users`];
const trimmedUrl = req.url.endsWith('/') ? req.url.slice(0, -1) : req.url;
if (
(req.method === 'POST' && postRestrictedUrls.includes(trimmedUrl)) ||
(req.method === 'GET' && getRestrictedUrls.includes(trimmedUrl)) ||
(req.method === 'DELETE' &&
new RegExp(`/${restEndpoint}/users/[^/]+`, 'gm').test(trimmedUrl)) ||
(req.method === 'POST' &&
new RegExp(`/${restEndpoint}/users/[^/]+/reinvite`, 'gm').test(trimmedUrl)) ||
new RegExp(`/${restEndpoint}/owner/[^/]+`, 'gm').test(trimmedUrl)
) {
Logger.verbose('User attempted to access endpoint without authorization', {
endpoint: `${req.method} ${trimmedUrl}`,
userId: isAuthenticatedRequest(req) ? req.user.id : 'unknown',
});
res.status(403).json({ status: 'error', message: 'Unauthorized' });
return;
}
next();
});
// middleware to refresh cookie before it expires
this.app.use(async (req: AuthenticatedRequest, res: Response, next: NextFunction) => {
const cookieAuth = options.jwtFromRequest(req);
if (cookieAuth && req.user) {
const cookieContents = jwt.decode(cookieAuth) as JwtPayload & { exp: number };
if (cookieContents.exp * 1000 - Date.now() < 259200000) {
// if cookie expires in < 3 days, renew it.
await issueCookie(res, req.user);
}
}
next();
});
authenticationMethods.apply(this);
ownerNamespace.apply(this);
meNamespace.apply(this);
passwordResetNamespace.apply(this);
usersNamespace.apply(this);
}


@ -0,0 +1,154 @@
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable import/no-cycle */
import { compare, genSaltSync, hashSync } from 'bcryptjs';
import express = require('express');
import validator from 'validator';
import { LoggerProxy as Logger } from 'n8n-workflow';
import { Db, InternalHooksManager, ResponseHelper } from '../..';
import { issueCookie } from '../auth/jwt';
import { N8nApp, PublicUser } from '../Interfaces';
import { validatePassword, sanitizeUser } from '../UserManagementHelper';
import type { AuthenticatedRequest, MeRequest } from '../../requests';
import { validateEntity } from '../../GenericHelpers';
import { User } from '../../databases/entities/User';
export function meNamespace(this: N8nApp): void {
/**
* Return the logged-in user.
*/
this.app.get(
`/${this.restEndpoint}/me`,
ResponseHelper.send(async (req: AuthenticatedRequest): Promise<PublicUser> => {
return sanitizeUser(req.user);
}),
);
/**
* Update the logged-in user's settings, except password.
*/
this.app.patch(
`/${this.restEndpoint}/me`,
ResponseHelper.send(
async (req: MeRequest.Settings, res: express.Response): Promise<PublicUser> => {
if (!req.body.email) {
Logger.debug('Request to update user email failed because of missing email in payload', {
userId: req.user.id,
payload: req.body,
});
throw new ResponseHelper.ResponseError('Email is mandatory', undefined, 400);
}
if (!validator.isEmail(req.body.email)) {
Logger.debug('Request to update user email failed because of invalid email in payload', {
userId: req.user.id,
invalidEmail: req.body.email,
});
throw new ResponseHelper.ResponseError('Invalid email address', undefined, 400);
}
const newUser = new User();
Object.assign(newUser, req.user, req.body);
await validateEntity(newUser);
const user = await Db.collections.User!.save(newUser);
Logger.info('User updated successfully', { userId: user.id });
await issueCookie(res, user);
const updatedkeys = Object.keys(req.body);
void InternalHooksManager.getInstance().onUserUpdate({
user_id: req.user.id,
fields_changed: updatedkeys,
});
return sanitizeUser(user);
},
),
);
/**
* Update the logged-in user's password.
*/
this.app.patch(
`/${this.restEndpoint}/me/password`,
ResponseHelper.send(async (req: MeRequest.Password, res: express.Response) => {
const { currentPassword, newPassword } = req.body;
if (typeof currentPassword !== 'string' || typeof newPassword !== 'string') {
throw new ResponseHelper.ResponseError('Invalid payload.', undefined, 400);
}
if (!req.user.password) {
throw new ResponseHelper.ResponseError('Requesting user not set up.');
}
const isCurrentPwCorrect = await compare(currentPassword, req.user.password);
if (!isCurrentPwCorrect) {
throw new ResponseHelper.ResponseError(
'Provided current password is incorrect.',
undefined,
400,
);
}
const validPassword = validatePassword(newPassword);
req.user.password = hashSync(validPassword, genSaltSync(10));
const user = await Db.collections.User!.save(req.user);
Logger.info('Password updated successfully', { userId: user.id });
await issueCookie(res, user);
void InternalHooksManager.getInstance().onUserUpdate({
user_id: req.user.id,
fields_changed: ['password'],
});
return { success: true };
}),
);
/**
* Store the logged-in user's survey answers.
*/
this.app.post(
`/${this.restEndpoint}/me/survey`,
ResponseHelper.send(async (req: MeRequest.SurveyAnswers) => {
const { body: personalizationAnswers } = req;
if (!personalizationAnswers) {
Logger.debug(
'Request to store user personalization survey failed because of empty payload',
{
userId: req.user.id,
},
);
throw new ResponseHelper.ResponseError(
'Personalization answers are mandatory',
undefined,
400,
);
}
await Db.collections.User!.save({
id: req.user.id,
personalizationAnswers,
});
Logger.info('User survey updated successfully', { userId: req.user.id });
void InternalHooksManager.getInstance().onPersonalizationSurveySubmitted(
req.user.id,
personalizationAnswers,
);
return { success: true };
}),
);
}


@ -0,0 +1,122 @@
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
import { hashSync, genSaltSync } from 'bcryptjs';
import * as express from 'express';
import validator from 'validator';
import { LoggerProxy as Logger } from 'n8n-workflow';
import { Db, InternalHooksManager, ResponseHelper } from '../..';
import config = require('../../../config');
import { validateEntity } from '../../GenericHelpers';
import { AuthenticatedRequest, OwnerRequest } from '../../requests';
import { issueCookie } from '../auth/jwt';
import { N8nApp } from '../Interfaces';
import { sanitizeUser, validatePassword } from '../UserManagementHelper';
export function ownerNamespace(this: N8nApp): void {
/**
* Promote a shell into the owner of the n8n instance,
* and enable `isInstanceOwnerSetUp` setting.
*/
this.app.post(
`/${this.restEndpoint}/owner`,
ResponseHelper.send(async (req: OwnerRequest.Post, res: express.Response) => {
const { email, firstName, lastName, password } = req.body;
const { id: userId } = req.user;
if (config.get('userManagement.isInstanceOwnerSetUp')) {
Logger.debug(
'Request to claim instance ownership failed because instance owner already exists',
{
userId,
},
);
throw new ResponseHelper.ResponseError('Invalid request', undefined, 400);
}
if (!email || !validator.isEmail(email)) {
Logger.debug('Request to claim instance ownership failed because of invalid email', {
userId,
invalidEmail: email,
});
throw new ResponseHelper.ResponseError('Invalid email address', undefined, 400);
}
const validPassword = validatePassword(password);
if (!firstName || !lastName) {
Logger.debug(
'Request to claim instance ownership failed because of missing first name or last name in payload',
{ userId, payload: req.body },
);
throw new ResponseHelper.ResponseError(
'First and last names are mandatory',
undefined,
400,
);
}
let owner = await Db.collections.User!.findOne(userId, {
relations: ['globalRole'],
});
if (!owner || (owner.globalRole.scope === 'global' && owner.globalRole.name !== 'owner')) {
Logger.debug(
'Request to claim instance ownership failed because user shell does not exist or has wrong role!',
{
userId,
},
);
throw new ResponseHelper.ResponseError('Invalid request', undefined, 400);
}
owner = Object.assign(owner, {
email,
firstName,
lastName,
password: hashSync(validPassword, genSaltSync(10)),
});
await validateEntity(owner);
owner = await Db.collections.User!.save(owner);
Logger.info('Owner was set up successfully', { userId: req.user.id });
await Db.collections.Settings!.update(
{ key: 'userManagement.isInstanceOwnerSetUp' },
{ value: JSON.stringify(true) },
);
config.set('userManagement.isInstanceOwnerSetUp', true);
Logger.debug('Setting isInstanceOwnerSetUp updated successfully', { userId: req.user.id });
await issueCookie(res, owner);
void InternalHooksManager.getInstance().onInstanceOwnerSetup({
user_id: userId,
});
return sanitizeUser(owner);
}),
);
/**
* Persist that the instance owner setup has been skipped
*/
this.app.post(
`/${this.restEndpoint}/owner/skip-setup`,
// eslint-disable-next-line @typescript-eslint/naming-convention
ResponseHelper.send(async (_req: AuthenticatedRequest, _res: express.Response) => {
await Db.collections.Settings!.update(
{ key: 'userManagement.skipInstanceOwnerSetup' },
{ value: JSON.stringify(true) },
);
config.set('userManagement.skipInstanceOwnerSetup', true);
return { success: true };
}),
);
}
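
A sketch of the owner-setup call this route expects; the URL is the same assumption as in the login sketch above, and the field values are placeholders.

async function setUpOwner(): Promise<void> {
	const res = await fetch('http://localhost:5678/rest/owner', {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		credentials: 'include', // the owner shell's auth cookie must be present
		body: JSON.stringify({
			email: 'owner@example.com', // placeholder
			firstName: 'Jane', // placeholder
			lastName: 'Doe', // placeholder
			password: 'SuperSafe123', // must pass validatePassword()
		}),
	});
	if (!res.ok) throw new Error(`Owner setup failed with status ${res.status}`);
}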


@ -0,0 +1,219 @@
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable import/no-cycle */
import express = require('express');
import { v4 as uuid } from 'uuid';
import { URL } from 'url';
import { genSaltSync, hashSync } from 'bcryptjs';
import validator from 'validator';
import { IsNull, MoreThanOrEqual, Not } from 'typeorm';
import { LoggerProxy as Logger } from 'n8n-workflow';
import { Db, InternalHooksManager, ResponseHelper } from '../..';
import { N8nApp } from '../Interfaces';
import { getInstanceBaseUrl, validatePassword } from '../UserManagementHelper';
import * as UserManagementMailer from '../email';
import type { PasswordResetRequest } from '../../requests';
import { issueCookie } from '../auth/jwt';
import config = require('../../../config');
export function passwordResetNamespace(this: N8nApp): void {
/**
* Send a password reset email.
*
* Authless endpoint.
*/
this.app.post(
`/${this.restEndpoint}/forgot-password`,
ResponseHelper.send(async (req: PasswordResetRequest.Email) => {
if (config.get('userManagement.emails.mode') === '') {
Logger.debug('Request to send password reset email failed because emailing was not set up');
throw new ResponseHelper.ResponseError(
'Email sending must be set up in order to request a password reset email',
undefined,
500,
);
}
const { email } = req.body;
if (!email) {
Logger.debug(
'Request to send password reset email failed because of missing email in payload',
{ payload: req.body },
);
throw new ResponseHelper.ResponseError('Email is mandatory', undefined, 400);
}
if (!validator.isEmail(email)) {
Logger.debug(
'Request to send password reset email failed because of invalid email in payload',
{ invalidEmail: email },
);
throw new ResponseHelper.ResponseError('Invalid email address', undefined, 400);
}
// User should just be able to reset password if one is already present
const user = await Db.collections.User!.findOne({ email, password: Not(IsNull()) });
if (!user || !user.password) {
Logger.debug(
'Request to send password reset email failed because no user was found for the provided email',
{ invalidEmail: email },
);
return;
}
user.resetPasswordToken = uuid();
const { id, firstName, lastName, resetPasswordToken } = user;
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) + 7200;
await Db.collections.User!.update(id, { resetPasswordToken, resetPasswordTokenExpiration });
const baseUrl = getInstanceBaseUrl();
const url = new URL(`${baseUrl}/change-password`);
url.searchParams.append('userId', id);
url.searchParams.append('token', resetPasswordToken);
try {
const mailer = await UserManagementMailer.getInstance();
await mailer.passwordReset({
email,
firstName,
lastName,
passwordResetUrl: url.toString(),
domain: baseUrl,
});
} catch (error) {
void InternalHooksManager.getInstance().onEmailFailed({
user_id: user.id,
message_type: 'Reset password',
});
if (error instanceof Error) {
throw new ResponseHelper.ResponseError(
`Please contact your administrator: ${error.message}`,
undefined,
500,
);
}
}
Logger.info('Sent password reset email successfully', { userId: user.id, email });
void InternalHooksManager.getInstance().onUserTransactionalEmail({
user_id: id,
message_type: 'Reset password',
});
void InternalHooksManager.getInstance().onUserPasswordResetRequestClick({
user_id: id,
});
}),
);
/**
* Verify password reset token and user ID.
*
* Authless endpoint.
*/
this.app.get(
`/${this.restEndpoint}/resolve-password-token`,
ResponseHelper.send(async (req: PasswordResetRequest.Credentials) => {
const { token: resetPasswordToken, userId: id } = req.query;
if (!resetPasswordToken || !id) {
Logger.debug(
'Request to resolve password token failed because of missing password reset token or user ID in query string',
{
queryString: req.query,
},
);
throw new ResponseHelper.ResponseError('', undefined, 400);
}
// Timestamp is saved in seconds
const currentTimestamp = Math.floor(Date.now() / 1000);
const user = await Db.collections.User!.findOne({
id,
resetPasswordToken,
resetPasswordTokenExpiration: MoreThanOrEqual(currentTimestamp),
});
if (!user) {
Logger.debug(
'Request to resolve password token failed because no user was found for the provided user ID and reset password token',
{
userId: id,
resetPasswordToken,
},
);
throw new ResponseHelper.ResponseError('', undefined, 404);
}
Logger.info('Reset-password token resolved successfully', { userId: id });
void InternalHooksManager.getInstance().onUserPasswordResetEmailClick({
user_id: id,
});
}),
);
/**
* Verify password reset token and user ID and update password.
*
* Authless endpoint.
*/
this.app.post(
`/${this.restEndpoint}/change-password`,
ResponseHelper.send(async (req: PasswordResetRequest.NewPassword, res: express.Response) => {
const { token: resetPasswordToken, userId, password } = req.body;
if (!resetPasswordToken || !userId || !password) {
Logger.debug(
'Request to change password failed because of missing user ID or password or reset password token in payload',
{
payload: req.body,
},
);
throw new ResponseHelper.ResponseError(
'Missing user ID or password or reset password token',
undefined,
400,
);
}
const validPassword = validatePassword(password);
// Timestamp is saved in seconds
const currentTimestamp = Math.floor(Date.now() / 1000);
const user = await Db.collections.User!.findOne({
id: userId,
resetPasswordToken,
resetPasswordTokenExpiration: MoreThanOrEqual(currentTimestamp),
});
if (!user) {
Logger.debug(
'Request to resolve password token failed because no user was found for the provided user ID and reset password token',
{
userId,
resetPasswordToken,
},
);
throw new ResponseHelper.ResponseError('', undefined, 404);
}
await Db.collections.User!.update(userId, {
password: hashSync(validPassword, genSaltSync(10)),
resetPasswordToken: null,
resetPasswordTokenExpiration: null,
});
Logger.info('User password updated successfully', { userId });
await issueCookie(res, user);
}),
);
}
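
For reference, the three-step client flow these endpoints implement, under the same base-URL assumptions as the earlier sketches.

const restUrl = 'http://localhost:5678/rest'; // assumed

// 1. Request the reset email; the endpoint also returns 200 for unknown emails.
async function requestReset(email: string): Promise<void> {
	await fetch(`${restUrl}/forgot-password`, {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		body: JSON.stringify({ email }),
	});
}

// 2. The emailed link points at /change-password?userId=...&token=...; the
//    frontend validates the pair first (404 when expired or unknown).
async function isResetTokenValid(userId: string, token: string): Promise<boolean> {
	const res = await fetch(`${restUrl}/resolve-password-token?userId=${userId}&token=${token}`);
	return res.ok;
}

// 3. Submit the new password together with the token; on success the server
//    issues the auth cookie, so the user is logged in immediately.
async function changePassword(userId: string, token: string, password: string): Promise<void> {
	await fetch(`${restUrl}/change-password`, {
		method: 'POST',
		headers: { 'Content-Type': 'application/json' },
		credentials: 'include',
		body: JSON.stringify({ userId, token, password }),
	});
}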


@ -0,0 +1,562 @@
/* eslint-disable no-restricted-syntax */
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
import { Response } from 'express';
import { In } from 'typeorm';
import { genSaltSync, hashSync } from 'bcryptjs';
import validator from 'validator';
import { LoggerProxy as Logger } from 'n8n-workflow';
import { Db, InternalHooksManager, ITelemetryUserDeletionData, ResponseHelper } from '../..';
import { N8nApp, PublicUser } from '../Interfaces';
import { UserRequest } from '../../requests';
import {
getInstanceBaseUrl,
isEmailSetUp,
sanitizeUser,
validatePassword,
} from '../UserManagementHelper';
import { User } from '../../databases/entities/User';
import { SharedWorkflow } from '../../databases/entities/SharedWorkflow';
import { SharedCredentials } from '../../databases/entities/SharedCredentials';
import * as UserManagementMailer from '../email/UserManagementMailer';
import config = require('../../../config');
import { issueCookie } from '../auth/jwt';
export function usersNamespace(this: N8nApp): void {
/**
* Send email invite(s) to one or multiple users and create user shell(s).
*/
this.app.post(
`/${this.restEndpoint}/users`,
ResponseHelper.send(async (req: UserRequest.Invite) => {
if (config.get('userManagement.emails.mode') === '') {
Logger.debug(
'Request to send email invite(s) to user(s) failed because emailing was not set up',
);
throw new ResponseHelper.ResponseError(
'Email sending must be set up in order to invite other users',
undefined,
500,
);
}
let mailer: UserManagementMailer.UserManagementMailer | undefined;
try {
mailer = await UserManagementMailer.getInstance();
} catch (error) {
if (error instanceof Error) {
throw new ResponseHelper.ResponseError(
`There is a problem with your SMTP setup! ${error.message}`,
undefined,
500,
);
}
}
// TODO: this should be checked in the middleware rather than here
if (config.get('userManagement.disabled')) {
Logger.debug(
'Request to send email invite(s) to user(s) failed because user management is disabled',
);
throw new ResponseHelper.ResponseError('User management is disabled');
}
if (!config.get('userManagement.isInstanceOwnerSetUp')) {
Logger.debug(
'Request to send email invite(s) to user(s) failed because the owner account is not set up',
);
throw new ResponseHelper.ResponseError(
'You must set up your own account before inviting others',
undefined,
400,
);
}
if (!Array.isArray(req.body)) {
Logger.debug(
'Request to send email invite(s) to user(s) failed because the payload is not an array',
{
payload: req.body,
},
);
throw new ResponseHelper.ResponseError('Invalid payload', undefined, 400);
}
if (!req.body.length) return [];
const createUsers: { [key: string]: string | null } = {};
// Validate payload
req.body.forEach((invite) => {
if (typeof invite !== 'object' || !invite.email) {
throw new ResponseHelper.ResponseError(
'Request to send email invite(s) to user(s) failed because the payload is not shaped like Array<{ email: string }>',
undefined,
400,
);
}
if (!validator.isEmail(invite.email)) {
Logger.debug('Invalid email in payload', { invalidEmail: invite.email });
throw new ResponseHelper.ResponseError(
`Request to send email invite(s) to user(s) failed because of an invalid email address: ${invite.email}`,
undefined,
400,
);
}
createUsers[invite.email] = null;
});
const role = await Db.collections.Role!.findOne({ scope: 'global', name: 'member' });
if (!role) {
Logger.error(
'Request to send email invite(s) to user(s) failed because no global member role was found in database',
);
throw new ResponseHelper.ResponseError(
'Members role not found in database - inconsistent state',
undefined,
500,
);
}
// remove/exclude existing users from creation
const existingUsers = await Db.collections.User!.find({
where: { email: In(Object.keys(createUsers)) },
});
existingUsers.forEach((user) => {
if (user.password) {
delete createUsers[user.email];
return;
}
createUsers[user.email] = user.id;
});
const usersToSetUp = Object.keys(createUsers).filter((email) => createUsers[email] === null);
const total = usersToSetUp.length;
Logger.debug(total > 1 ? `Creating ${total} user shells...` : `Creating 1 user shell...`);
try {
await Db.transaction(async (transactionManager) => {
return Promise.all(
usersToSetUp.map(async (email) => {
const newUser = Object.assign(new User(), {
email,
globalRole: role,
});
const savedUser = await transactionManager.save<User>(newUser);
createUsers[savedUser.email] = savedUser.id;
return savedUser;
}),
);
});
void InternalHooksManager.getInstance().onUserInvite({
user_id: req.user.id,
target_user_id: Object.values(createUsers) as string[],
});
} catch (error) {
Logger.error('Failed to create user shells', { userShells: createUsers });
throw new ResponseHelper.ResponseError('An error occurred during user creation');
}
Logger.info('Created user shell(s) successfully', { userId: req.user.id });
Logger.verbose(total > 1 ? `${total} user shells created` : `1 user shell created`, {
userShells: createUsers,
});
const baseUrl = getInstanceBaseUrl();
const usersPendingSetup = Object.entries(createUsers).filter(([email, id]) => id && email);
// send invite email to new or not yet setup users
const emailingResults = await Promise.all(
usersPendingSetup.map(async ([email, id]) => {
// eslint-disable-next-line @typescript-eslint/restrict-template-expressions
const inviteAcceptUrl = `${baseUrl}/signup?inviterId=${req.user.id}&inviteeId=${id}`;
const result = await mailer?.invite({
email,
inviteAcceptUrl,
domain: baseUrl,
});
const resp: { user: { id: string | null; email: string }; error?: string } = {
user: {
id,
email,
},
};
if (result?.success) {
void InternalHooksManager.getInstance().onUserTransactionalEmail({
user_id: id!,
message_type: 'New user invite',
});
} else {
void InternalHooksManager.getInstance().onEmailFailed({
user_id: req.user.id,
message_type: 'New user invite',
});
Logger.error('Failed to send email', {
userId: req.user.id,
inviteAcceptUrl,
domain: baseUrl,
email,
});
resp.error = `Email could not be sent`;
}
return resp;
}),
);
Logger.debug(
usersPendingSetup.length > 1
? `Sent ${usersPendingSetup.length} invite emails successfully`
: `Sent 1 invite email successfully`,
{ userShells: createUsers },
);
return emailingResults;
}),
);
/**
* Validate invite token to enable invitee to set up their account.
*
* Authless endpoint.
*/
this.app.get(
`/${this.restEndpoint}/resolve-signup-token`,
ResponseHelper.send(async (req: UserRequest.ResolveSignUp) => {
const { inviterId, inviteeId } = req.query;
if (!inviterId || !inviteeId) {
Logger.debug(
'Request to resolve signup token failed because of missing user IDs in query string',
{ inviterId, inviteeId },
);
throw new ResponseHelper.ResponseError('Invalid payload', undefined, 400);
}
// Postgres validates UUID format
for (const userId of [inviterId, inviteeId]) {
if (!validator.isUUID(userId)) {
Logger.debug('Request to resolve signup token failed because of invalid user ID', {
userId,
});
throw new ResponseHelper.ResponseError('Invalid userId', undefined, 400);
}
}
const users = await Db.collections.User!.find({ where: { id: In([inviterId, inviteeId]) } });
if (users.length !== 2) {
Logger.debug(
'Request to resolve signup token failed because the ID of the inviter and/or the ID of the invitee were not found in database',
{ inviterId, inviteeId },
);
throw new ResponseHelper.ResponseError('Invalid invite URL', undefined, 400);
}
const invitee = users.find((user) => user.id === inviteeId);
if (!invitee || invitee.password) {
Logger.error('Invalid invite URL - invitee already setup', {
inviterId,
inviteeId,
});
throw new ResponseHelper.ResponseError('The invitation was likely either deleted or already claimed', undefined, 400);
}
const inviter = users.find((user) => user.id === inviterId);
if (!inviter || !inviter.email || !inviter.firstName) {
Logger.error(
'Request to resolve signup token failed because inviter does not exist or is not set up',
{
inviterId: inviter?.id,
},
);
throw new ResponseHelper.ResponseError('Invalid request', undefined, 400);
}
void InternalHooksManager.getInstance().onUserInviteEmailClick({
user_id: inviteeId,
});
const { firstName, lastName } = inviter;
return { inviter: { firstName, lastName } };
}),
);
/**
* Fill out user shell with first name, last name, and password.
*
* Authless endpoint.
*/
this.app.post(
`/${this.restEndpoint}/users/:id`,
ResponseHelper.send(async (req: UserRequest.Update, res: Response) => {
const { id: inviteeId } = req.params;
const { inviterId, firstName, lastName, password } = req.body;
if (!inviterId || !inviteeId || !firstName || !lastName || !password) {
Logger.debug(
'Request to fill out a user shell failed because of missing properties in payload',
{ payload: req.body },
);
throw new ResponseHelper.ResponseError('Invalid payload', undefined, 400);
}
const validPassword = validatePassword(password);
const users = await Db.collections.User!.find({
where: { id: In([inviterId, inviteeId]) },
relations: ['globalRole'],
});
if (users.length !== 2) {
Logger.debug(
'Request to fill out a user shell failed because the inviter ID and/or invitee ID were not found in database',
{
inviterId,
inviteeId,
},
);
throw new ResponseHelper.ResponseError('Invalid payload or URL', undefined, 400);
}
const invitee = users.find((user) => user.id === inviteeId) as User;
if (invitee.password) {
Logger.debug(
'Request to fill out a user shell failed because the invite had already been accepted',
{ inviteeId },
);
throw new ResponseHelper.ResponseError(
'This invite has been accepted already',
undefined,
400,
);
}
invitee.firstName = firstName;
invitee.lastName = lastName;
invitee.password = hashSync(validPassword, genSaltSync(10));
const updatedUser = await Db.collections.User!.save(invitee);
await issueCookie(res, updatedUser);
void InternalHooksManager.getInstance().onUserSignup({
user_id: invitee.id,
});
return sanitizeUser(updatedUser);
}),
);
this.app.get(
`/${this.restEndpoint}/users`,
ResponseHelper.send(async () => {
const users = await Db.collections.User!.find({ relations: ['globalRole'] });
return users.map((user): PublicUser => sanitizeUser(user, ['personalizationAnswers']));
}),
);
/**
* Delete a user. Optionally, designate a transferee for their workflows and credentials.
*/
this.app.delete(
`/${this.restEndpoint}/users/:id`,
ResponseHelper.send(async (req: UserRequest.Delete) => {
const { id: idToDelete } = req.params;
if (req.user.id === idToDelete) {
Logger.debug(
'Request to delete a user failed because it attempted to delete the requesting user',
{ userId: req.user.id },
);
throw new ResponseHelper.ResponseError('Cannot delete your own user', undefined, 400);
}
const { transferId } = req.query;
if (transferId === idToDelete) {
throw new ResponseHelper.ResponseError(
'Request to delete a user failed because the user to delete and the transferee are the same user',
undefined,
400,
);
}
const users = await Db.collections.User!.find({
where: { id: In([transferId, idToDelete]) },
});
if (!users.length || (transferId && users.length !== 2)) {
throw new ResponseHelper.ResponseError(
'Request to delete a user failed because the ID of the user to delete and/or the ID of the transferee were not found in DB',
undefined,
404,
);
}
const userToDelete = users.find((user) => user.id === req.params.id) as User;
if (transferId) {
const transferee = users.find((user) => user.id === transferId);
await Db.transaction(async (transactionManager) => {
await transactionManager.update(
SharedWorkflow,
{ user: userToDelete },
{ user: transferee },
);
await transactionManager.update(
SharedCredentials,
{ user: userToDelete },
{ user: transferee },
);
await transactionManager.delete(User, { id: userToDelete.id });
});
return { success: true };
}
const [ownedSharedWorkflows, ownedSharedCredentials] = await Promise.all([
Db.collections.SharedWorkflow!.find({
relations: ['workflow'],
where: { user: userToDelete },
}),
Db.collections.SharedCredentials!.find({
relations: ['credentials'],
where: { user: userToDelete },
}),
]);
await Db.transaction(async (transactionManager) => {
const ownedWorkflows = await Promise.all(
ownedSharedWorkflows.map(async ({ workflow }) => {
if (workflow.active) {
// deactivate before deleting
await this.activeWorkflowRunner.remove(workflow.id.toString());
}
return workflow;
}),
);
await transactionManager.remove(ownedWorkflows);
await transactionManager.remove(
ownedSharedCredentials.map(({ credentials }) => credentials),
);
await transactionManager.delete(User, { id: userToDelete.id });
});
const telemetryData: ITelemetryUserDeletionData = {
user_id: req.user.id,
target_user_old_status: userToDelete.isPending ? 'invited' : 'active',
target_user_id: idToDelete,
};
telemetryData.migration_strategy = transferId ? 'transfer_data' : 'delete_data';
if (transferId) {
telemetryData.migration_user_id = transferId;
}
void InternalHooksManager.getInstance().onUserDeletion(req.user.id, telemetryData);
return { success: true };
}),
);
/**
* Resend email invite to user.
*/
this.app.post(
`/${this.restEndpoint}/users/:id/reinvite`,
ResponseHelper.send(async (req: UserRequest.Reinvite) => {
const { id: idToReinvite } = req.params;
if (!isEmailSetUp()) {
Logger.error('Request to reinvite a user failed because email sending was not set up');
throw new ResponseHelper.ResponseError(
'Email sending must be set up in order to invite other users',
undefined,
500,
);
}
const reinvitee = await Db.collections.User!.findOne({ id: idToReinvite });
if (!reinvitee) {
Logger.debug(
'Request to reinvite a user failed because the ID of the reinvitee was not found in database',
);
throw new ResponseHelper.ResponseError('Could not find user', undefined, 404);
}
if (reinvitee.password) {
Logger.debug(
'Request to reinvite a user failed because the invite had already been accepted',
{ userId: reinvitee.id },
);
throw new ResponseHelper.ResponseError(
'User has already accepted the invite',
undefined,
400,
);
}
const baseUrl = getInstanceBaseUrl();
const inviteAcceptUrl = `${baseUrl}/signup?inviterId=${req.user.id}&inviteeId=${reinvitee.id}`;
let mailer: UserManagementMailer.UserManagementMailer | undefined;
try {
mailer = await UserManagementMailer.getInstance();
} catch (error) {
if (error instanceof Error) {
throw new ResponseHelper.ResponseError(error.message, undefined, 500);
}
}
const result = await mailer?.invite({
email: reinvitee.email,
inviteAcceptUrl,
domain: baseUrl,
});
if (!result?.success) {
void InternalHooksManager.getInstance().onEmailFailed({
user_id: req.user.id,
message_type: 'Resend invite',
});
Logger.error('Failed to send email', {
email: reinvitee.email,
inviteAcceptUrl,
domain: baseUrl,
});
throw new ResponseHelper.ResponseError(
`Failed to send email to ${reinvitee.email}`,
undefined,
500,
);
}
void InternalHooksManager.getInstance().onUserReinvite({
user_id: req.user.id,
target_user_id: reinvitee.id,
});
void InternalHooksManager.getInstance().onUserTransactionalEmail({
user_id: reinvitee.id,
message_type: 'Resend invite',
});
return { success: true };
}),
);
}

View file

@@ -24,6 +24,7 @@ import {
WorkflowCredentials,
WorkflowRunner,
} from '.';
+import { getWorkflowOwner } from './UserManagement/UserManagementHelper';

export class WaitTrackerClass {
activeExecutionsInstance: ActiveExecutions.ActiveExecutions;
@@ -157,10 +158,16 @@ export class WaitTrackerClass {
throw new Error('The execution did succeed and can so not be started again.');
}

+if (!fullExecutionData.workflowData.id) {
+throw new Error('Only saved workflows can be resumed.');
+}
+const user = await getWorkflowOwner(fullExecutionData.workflowData.id.toString());
+
const data: IWorkflowExecutionDataProcess = {
executionMode: fullExecutionData.mode,
executionData: fullExecutionData.data,
workflowData: fullExecutionData.workflowData,
+userId: user.id,
};

// Start the execution again

View file

@@ -26,6 +26,7 @@ import {
WorkflowCredentials,
WorkflowExecuteAdditionalData,
} from '.';
+import { getWorkflowOwner } from './UserManagement/UserManagementHelper';

export class WaitingWebhooks {
async executeWebhook(
@@ -111,7 +112,14 @@ export class WaitingWebhooks {
settings: workflowData.settings,
});

-const additionalData = await WorkflowExecuteAdditionalData.getBase();
+let workflowOwner;
+try {
+workflowOwner = await getWorkflowOwner(workflowData.id!.toString());
+} catch (error) {
+throw new ResponseHelper.ResponseError('Could not find workflow', undefined, 404);
+}
+
+const additionalData = await WorkflowExecuteAdditionalData.getBase(workflowOwner.id);

const webhookData = NodeHelpers.getNodeWebhooks(
workflow,

View file

@@ -1,3 +1,4 @@
+/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-unsafe-call */
/* eslint-disable no-param-reassign */
/* eslint-disable @typescript-eslint/prefer-optional-chain */
@@ -53,6 +54,9 @@ import {
// eslint-disable-next-line import/no-cycle
import * as ActiveExecutions from './ActiveExecutions';
+import { User } from './databases/entities/User';
+import { WorkflowEntity } from './databases/entities/WorkflowEntity';
+import { getWorkflowOwner } from './UserManagement/UserManagementHelper';

const activeExecutions = ActiveExecutions.getInstance();
@@ -223,8 +227,22 @@ export async function executeWebhook(
throw new ResponseHelper.ResponseError(errorMessage, 500, 500);
}

+let user: User;
+if (
+(workflowData as WorkflowEntity).shared?.length &&
+(workflowData as WorkflowEntity).shared[0].user
+) {
+user = (workflowData as WorkflowEntity).shared[0].user;
+} else {
+try {
+user = await getWorkflowOwner(workflowData.id.toString());
+} catch (error) {
+throw new ResponseHelper.ResponseError('Cannot find workflow', undefined, 404);
+}
+}
+
// Prepare everything that is needed to run the workflow
-const additionalData = await WorkflowExecuteAdditionalData.getBase();
+const additionalData = await WorkflowExecuteAdditionalData.getBase(user.id);

// Add the Response and Request so that this data can be accessed in the node
additionalData.httpRequest = req;
@@ -404,6 +422,7 @@ export async function executeWebhook(
executionData: runExecutionData,
sessionId,
workflowData,
+userId: user.id,
};

let responsePromise: IDeferredPromise<IN8nHttpFullResponse> | undefined;

View file

@@ -1,3 +1,4 @@
+/* eslint-disable import/no-cycle */
/* eslint-disable no-restricted-syntax */
/* eslint-disable @typescript-eslint/restrict-plus-operands */
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
@@ -39,7 +40,6 @@ import {
import { LessThanOrEqual } from 'typeorm';
import { DateUtils } from 'typeorm/util/DateUtils';
import * as config from '../config';
-// eslint-disable-next-line import/no-cycle
import {
ActiveExecutions,
CredentialsHelper,
@@ -60,6 +60,12 @@ import {
WorkflowCredentials,
WorkflowHelpers,
} from '.';
+import {
+checkPermissionsForExecution,
+getUserById,
+getWorkflowOwner,
+} from './UserManagement/UserManagementHelper';
+import { whereClause } from './WorkflowHelpers';

const ERROR_TRIGGER_TYPE = config.get('nodes.errorTriggerType') as string;
@@ -120,20 +126,36 @@ function executeErrorWorkflow(
workflowId: workflowData.id,
});

// If a specific error workflow is set run only that one
+// First, do permission checks.
+if (!workflowData.id) {
+// Manual executions do not trigger error workflows
+// So this if should never happen. It was added to
+// make sure there are no possible security gaps
+return;
+}
// eslint-disable-next-line @typescript-eslint/no-floating-promises
-WorkflowHelpers.executeErrorWorkflow(
-workflowData.settings.errorWorkflow as string,
-workflowErrorData,
-);
+getWorkflowOwner(workflowData.id).then((user) => {
+void WorkflowHelpers.executeErrorWorkflow(
+workflowData.settings!.errorWorkflow as string,
+workflowErrorData,
+user,
+);
+});
} else if (
mode !== 'error' &&
workflowData.id !== undefined &&
workflowData.nodes.some((node) => node.type === ERROR_TRIGGER_TYPE)
) {
Logger.verbose(`Start internal error workflow`, { executionId, workflowId: workflowData.id });
-// If the workflow contains
-// eslint-disable-next-line @typescript-eslint/no-floating-promises
-WorkflowHelpers.executeErrorWorkflow(workflowData.id.toString(), workflowErrorData);
+void getWorkflowOwner(workflowData.id).then((user) => {
+void WorkflowHelpers.executeErrorWorkflow(
+workflowData.id!.toString(),
+workflowErrorData,
+user,
+);
+});
}
}
}
@@ -698,6 +720,7 @@ function hookFunctionsSaveWorker(): IWorkflowExecuteHooks {
export async function getRunData(
workflowData: IWorkflowBase,
+userId: string,
inputData?: INodeExecutionData[],
): Promise<IWorkflowExecutionDataProcess> {
const mode = 'integrated';
@@ -751,27 +774,47 @@ export async function getRunData(
executionData: runExecutionData,
// @ts-ignore
workflowData,
+userId,
};

return runData;
}

-export async function getWorkflowData(workflowInfo: IExecuteWorkflowInfo): Promise<IWorkflowBase> {
+export async function getWorkflowData(
+workflowInfo: IExecuteWorkflowInfo,
+userId: string,
+): Promise<IWorkflowBase> {
if (workflowInfo.id === undefined && workflowInfo.code === undefined) {
throw new Error(
`No information about the workflow to execute found. Please provide either the "id" or "code"!`,
);
}

-if (Db.collections.Workflow === null) {
-// The first time executeWorkflow gets called the Database has
-// to get initialized first
-await Db.init();
-}
-
let workflowData: IWorkflowBase | undefined;
if (workflowInfo.id !== undefined) {
-workflowData = await Db.collections.Workflow!.findOne(workflowInfo.id);
+if (Db.collections.Workflow === null) {
+// The first time executeWorkflow gets called the Database has
+// to get initialized first
+await Db.init();
+}
+
+const user = await getUserById(userId);
+let relations = ['workflow', 'workflow.tags'];
+if (config.get('workflowTagsDisabled')) {
+relations = relations.filter((relation) => relation !== 'workflow.tags');
+}
+const shared = await Db.collections.SharedWorkflow!.findOne({
+relations,
+where: whereClause({
+user,
+entityType: 'workflow',
+entityId: workflowInfo.id,
+}),
+});
+
+workflowData = shared?.workflow;
+
if (workflowData === undefined) {
throw new Error(`The workflow with the id "${workflowInfo.id}" does not exist.`);
}
@@ -805,7 +848,7 @@ export async function executeWorkflow(
const nodeTypes = NodeTypes();

const workflowData =
-loadedWorkflowData !== undefined ? loadedWorkflowData : await getWorkflowData(workflowInfo);
+loadedWorkflowData ?? (await getWorkflowData(workflowInfo, additionalData.userId));

const workflowName = workflowData ? workflowData.name : undefined;
const workflow = new Workflow({
@@ -819,7 +862,7 @@ export async function executeWorkflow(
});

const runData =
-loadedRunData !== undefined ? loadedRunData : await getRunData(workflowData, inputData);
+loadedRunData ?? (await getRunData(workflowData, additionalData.userId, inputData));

let executionId;
@@ -834,9 +877,11 @@ export async function executeWorkflow(
let data;
try {
+await checkPermissionsForExecution(workflow, additionalData.userId);
+
// Create new additionalData to have different workflow loaded and to call
// different webooks
-const additionalDataIntegrated = await getBase();
+const additionalDataIntegrated = await getBase(additionalData.userId);
additionalDataIntegrated.hooks = getWorkflowHooksIntegrated(
runData.executionMode,
executionId,
@@ -908,6 +953,9 @@ export async function executeWorkflow(
stoppedAt: fullRunData.stoppedAt,
workflowData,
};
+if (workflowData.id) {
+fullExecutionData.workflowId = workflowData.id as string;
+}

const executionData = ResponseHelper.flattenExecutionData(fullExecutionData);
@@ -919,7 +967,12 @@ export async function executeWorkflow(
}

await externalHooks.run('workflow.postExecute', [data, workflowData, executionId]);
-void InternalHooksManager.getInstance().onWorkflowPostExecute(executionId, workflowData, data);
+void InternalHooksManager.getInstance().onWorkflowPostExecute(
+executionId,
+workflowData,
+data,
+additionalData.userId,
+);

if (data.finished === true) {
// Workflow did finish successfully
@@ -969,6 +1022,7 @@ export function sendMessageToUI(source: string, messages: any[]) {
* @returns {Promise<IWorkflowExecuteAdditionalData>}
*/
export async function getBase(
+userId: string,
currentNodeParameters?: INodeParameters,
executionTimeoutTimestamp?: number,
): Promise<IWorkflowExecuteAdditionalData> {
@@ -995,6 +1049,7 @@ export async function getBase(
webhookTestBaseUrl,
currentNodeParameters,
executionTimeoutTimestamp,
+userId,
};
}

View file

@@ -1,3 +1,4 @@
+/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-unsafe-call */
/* eslint-disable @typescript-eslint/no-unsafe-return */
/* eslint-disable @typescript-eslint/no-unsafe-assignment */
@@ -19,7 +20,6 @@ import {
LoggerProxy as Logger,
Workflow,
} from 'n8n-workflow';
-import { validate } from 'class-validator';
// eslint-disable-next-line import/no-cycle
import {
CredentialTypes,
@@ -29,13 +29,15 @@ import {
IWorkflowErrorData,
IWorkflowExecutionDataProcess,
NodeTypes,
-ResponseHelper,
+WhereClause,
WorkflowRunner,
} from '.';
import * as config from '../config';
// eslint-disable-next-line import/no-cycle
import { WorkflowEntity } from './databases/entities/WorkflowEntity';
+import { User } from './databases/entities/User';
+import { getWorkflowOwner } from './UserManagement/UserManagementHelper';

const ERROR_TRIGGER_TYPE = config.get('nodes.errorTriggerType') as string;
@@ -91,10 +93,36 @@ export function isWorkflowIdValid(id: string | null | undefined | number): boole
export async function executeErrorWorkflow(
workflowId: string,
workflowErrorData: IWorkflowErrorData,
+runningUser: User,
): Promise<void> {
// Wrap everything in try/catch to make sure that no errors bubble up and all get caught here
try {
-const workflowData = await Db.collections.Workflow!.findOne({ id: Number(workflowId) });
+let workflowData;
+if (workflowId.toString() !== workflowErrorData.workflow.id?.toString()) {
+// To make this code easier to understand, we split it in 2 parts:
+// 1) Fetch the owner of the errored workflows and then
+// 2) if now instance owner, then check if the user has access to the
+// triggered workflow.
+const user = await getWorkflowOwner(workflowErrorData.workflow.id!);
+if (user.globalRole.name === 'owner') {
+workflowData = await Db.collections.Workflow!.findOne({ id: Number(workflowId) });
+} else {
+const sharedWorkflowData = await Db.collections.SharedWorkflow!.findOne({
+where: {
+workflow: { id: workflowId },
+user,
+},
+relations: ['workflow'],
+});
+if (sharedWorkflowData) {
+workflowData = sharedWorkflowData.workflow;
+}
+}
+} else {
+workflowData = await Db.collections.Workflow!.findOne({ id: Number(workflowId) });
+}

if (workflowData === undefined) {
// The error workflow could not be found
@@ -106,6 +134,15 @@ export async function executeErrorWorkflow(
return;
}

+const user = await getWorkflowOwner(workflowId);
+if (user.id !== runningUser.id) {
+// The error workflow could not be found
+Logger.warn(
+`An attempt to execute workflow ID ${workflowId} as error workflow was blocked due to wrong permission`,
+);
+return;
+}
+
const executionMode = 'error';
const nodeTypes = NodeTypes();
@@ -169,6 +206,7 @@ export async function executeErrorWorkflow(
executionMode,
executionData: runExecutionData,
workflowData,
+userId: user.id,
};

const workflowRunner = new WorkflowRunner();
@@ -521,34 +559,40 @@ export async function replaceInvalidCredentials(workflow: WorkflowEntity): Promi
return workflow;
}

-// TODO: Deduplicate `validateWorkflow` and `throwDuplicateEntryError` with TagHelpers?
-
-// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
-export async function validateWorkflow(newWorkflow: WorkflowEntity) {
-const errors = await validate(newWorkflow);
-
-if (errors.length) {
-const validationErrorMessage = Object.values(errors[0].constraints!)[0];
-throw new ResponseHelper.ResponseError(validationErrorMessage, undefined, 400);
-}
-}
-
-// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
-export function throwDuplicateEntryError(error: Error) {
-const errorMessage = error.message.toLowerCase();
-if (errorMessage.includes('unique') || errorMessage.includes('duplicate')) {
-throw new ResponseHelper.ResponseError(
-'There is already a workflow with this name',
-undefined,
-400,
-);
-}
-
-throw new ResponseHelper.ResponseError(errorMessage, undefined, 400);
-}
-
-export type NameRequest = Express.Request & {
-query: {
-name?: string;
-};
-};
+/**
+ * Build a `where` clause for a TypeORM entity search,
+ * checking for member access if the user is not an owner.
+ */
+export function whereClause({
+user,
+entityType,
+entityId = '',
+}: {
+user: User;
+entityType: 'workflow' | 'credentials';
+entityId?: string;
+}): WhereClause {
+const where: WhereClause = entityId ? { [entityType]: { id: entityId } } : {};
+
+// TODO: Decide if owner access should be restricted
+if (user.globalRole.name !== 'owner') {
+where.user = { id: user.id };
+}
+
+return where;
+}
+
+/**
+ * Get the IDs of the workflows that have been shared with the user.
+ */
+export async function getSharedWorkflowIds(user: User): Promise<number[]> {
+const sharedWorkflows = await Db.collections.SharedWorkflow!.find({
+relations: ['workflow'],
+where: whereClause({
+user,
+entityType: 'workflow',
+}),
+});
+return sharedWorkflows.map(({ workflow }) => workflow.id);
+}
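As a hedged illustration of what `whereClause` evaluates to (the import path and user fixtures below are hypothetical):

import type { User } from './databases/entities/User';
import { whereClause } from './WorkflowHelpers';

declare const owner: User; // globalRole.name === 'owner'
declare const member: User; // any non-owner global role

// Owner: only the entity filter, no user restriction.
whereClause({ user: owner, entityType: 'credentials', entityId: '42' });
// -> { credentials: { id: '42' } }

// Member: additionally scoped to rows shared with that user.
whereClause({ user: member, entityType: 'credentials', entityId: '42' });
// -> { credentials: { id: '42' }, user: { id: member.id } }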

View file

@@ -57,6 +57,7 @@ import {
} from '.';
import * as Queue from './Queue';
import { InternalHooksManager } from './InternalHooksManager';
+import { checkPermissionsForExecution } from './UserManagement/UserManagementHelper';

export class WorkflowRunner {
activeExecutions: ActiveExecutions.ActiveExecutions;
@@ -177,6 +178,7 @@ export class WorkflowRunner {
executionId!,
data.workflowData,
executionData,
+data.userId,
);
})
.catch((error) => {
@@ -246,6 +248,7 @@ export class WorkflowRunner {
staticData: data.workflowData.staticData,
});
const additionalData = await WorkflowExecuteAdditionalData.getBase(
+data.userId,
undefined,
workflowTimeout <= 0 ? undefined : Date.now() + workflowTimeout * 1000,
);
@@ -265,6 +268,9 @@ export class WorkflowRunner {
`Execution for workflow ${data.workflowData.name} was assigned id ${executionId}`,
{ executionId },
);
+
+await checkPermissionsForExecution(workflow, data.userId);
+
additionalData.hooks = WorkflowExecuteAdditionalData.getWorkflowHooksMain(
data,
executionId,

View file

@@ -52,6 +52,7 @@ import { getLogger } from './Logger';
import * as config from '../config';
import { InternalHooksManager } from './InternalHooksManager';
+import { checkPermissionsForExecution } from './UserManagement/UserManagementHelper';

export class WorkflowRunnerProcess {
data: IWorkflowExecutionDataProcessWithExecution | undefined;
@@ -88,6 +89,7 @@ export class WorkflowRunnerProcess {
LoggerProxy.init(logger);

this.data = inputData;
+const { userId } = inputData;

logger.verbose('Initializing n8n sub-process', {
pid: process.pid,
@@ -235,7 +237,9 @@ export class WorkflowRunnerProcess {
staticData: this.data.workflowData.staticData,
settings: this.data.workflowData.settings,
});

+await checkPermissionsForExecution(this.workflow, userId);
+
const additionalData = await WorkflowExecuteAdditionalData.getBase(
+userId,
undefined,
workflowTimeout <= 0 ? undefined : Date.now() + workflowTimeout * 1000,
);
@@ -273,8 +277,15 @@ export class WorkflowRunnerProcess {
additionalData: IWorkflowExecuteAdditionalData,
inputData?: INodeExecutionData[] | undefined,
): Promise<Array<INodeExecutionData[] | null> | IRun> => {
-const workflowData = await WorkflowExecuteAdditionalData.getWorkflowData(workflowInfo);
-const runData = await WorkflowExecuteAdditionalData.getRunData(workflowData, inputData);
+const workflowData = await WorkflowExecuteAdditionalData.getWorkflowData(
+workflowInfo,
+userId,
+);
+const runData = await WorkflowExecuteAdditionalData.getRunData(
+workflowData,
+additionalData.userId,
+inputData,
+);
await sendToParentProcess('startExecution', { runData });
const executionId: string = await new Promise((resolve) => {
this.executionIdCallback = (executionId: string) => {
@@ -300,6 +311,7 @@ export class WorkflowRunnerProcess {
executionId,
workflowData,
result,
+additionalData.userId,
);
await sendToParentProcess('finishExecution', { executionId, result });
delete this.childExecutions[executionId];

View file

@@ -0,0 +1,419 @@
/* eslint-disable @typescript-eslint/no-shadow */
/* eslint-disable @typescript-eslint/no-unused-vars */
/* eslint-disable no-restricted-syntax */
/* eslint-disable @typescript-eslint/no-non-null-assertion */
/* eslint-disable import/no-cycle */
import express = require('express');
import { In } from 'typeorm';
import { UserSettings, Credentials } from 'n8n-core';
import { INodeCredentialTestResult, LoggerProxy } from 'n8n-workflow';
import { getLogger } from '../Logger';
import {
CredentialsHelper,
Db,
GenericHelpers,
ICredentialsDb,
ICredentialsResponse,
whereClause,
ResponseHelper,
} from '..';
import { RESPONSE_ERROR_MESSAGES } from '../constants';
import { CredentialsEntity } from '../databases/entities/CredentialsEntity';
import { SharedCredentials } from '../databases/entities/SharedCredentials';
import { validateEntity } from '../GenericHelpers';
import type { CredentialRequest } from '../requests';
import config = require('../../config');
import { externalHooks } from '../Server';
export const credentialsController = express.Router();
/**
* Initialize Logger if needed
*/
credentialsController.use((req, res, next) => {
try {
LoggerProxy.getInstance();
} catch (error) {
LoggerProxy.init(getLogger());
}
next();
});
/**
* GET /credentials
*/
credentialsController.get(
'/',
ResponseHelper.send(async (req: CredentialRequest.GetAll): Promise<ICredentialsResponse[]> => {
let credentials: ICredentialsDb[] = [];
const filter = req.query.filter ? (JSON.parse(req.query.filter) as Record<string, string>) : {};
try {
if (req.user.globalRole.name === 'owner') {
credentials = await Db.collections.Credentials!.find({
select: ['id', 'name', 'type', 'nodesAccess', 'createdAt', 'updatedAt'],
where: filter,
});
} else {
const shared = await Db.collections.SharedCredentials!.find({
where: whereClause({
user: req.user,
entityType: 'credentials',
}),
});
if (!shared.length) return [];
credentials = await Db.collections.Credentials!.find({
select: ['id', 'name', 'type', 'nodesAccess', 'createdAt', 'updatedAt'],
where: {
id: In(shared.map(({ credentialId }) => credentialId)),
...filter,
},
});
}
} catch (error) {
LoggerProxy.error('Request to list credentials failed', error);
throw error;
}
return credentials.map((credential) => {
// eslint-disable-next-line no-param-reassign
credential.id = credential.id.toString();
return credential as ICredentialsResponse;
});
}),
);
/**
* GET /credentials/new
*
* Generate a unique credential name.
*/
credentialsController.get(
'/new',
ResponseHelper.send(async (req: CredentialRequest.NewName): Promise<{ name: string }> => {
const { name: newName } = req.query;
return GenericHelpers.generateUniqueName(
newName ?? config.get('credentials.defaultName'),
'credentials',
);
}),
);
/**
* POST /credentials/test
*
* Test if a credential is valid.
*/
credentialsController.post(
'/test',
ResponseHelper.send(async (req: CredentialRequest.Test): Promise<INodeCredentialTestResult> => {
const { credentials, nodeToTestWith } = req.body;
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
return {
status: 'Error',
message: RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY,
};
}
const helper = new CredentialsHelper(encryptionKey);
return helper.testCredentials(req.user, credentials.type, credentials, nodeToTestWith);
}),
);
/**
* POST /credentials
*/
credentialsController.post(
'/',
ResponseHelper.send(async (req: CredentialRequest.Create) => {
delete req.body.id; // delete if sent
const newCredential = new CredentialsEntity();
Object.assign(newCredential, req.body);
await validateEntity(newCredential);
// Add the added date for node access permissions
for (const nodeAccess of newCredential.nodesAccess) {
nodeAccess.date = new Date();
}
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new ResponseHelper.ResponseError(
RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY,
undefined,
500,
);
}
// Encrypt the data
const coreCredential = new Credentials(
{ id: null, name: newCredential.name },
newCredential.type,
newCredential.nodesAccess,
);
// @ts-ignore
coreCredential.setData(newCredential.data, encryptionKey);
const encryptedData = coreCredential.getDataToSave() as ICredentialsDb;
Object.assign(newCredential, encryptedData);
await externalHooks.run('credentials.create', [encryptedData]);
const role = await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'credential',
});
const { id, ...rest } = await Db.transaction(async (transactionManager) => {
const savedCredential = await transactionManager.save<CredentialsEntity>(newCredential);
savedCredential.data = newCredential.data;
const newSharedCredential = new SharedCredentials();
Object.assign(newSharedCredential, {
role,
user: req.user,
credentials: savedCredential,
});
await transactionManager.save<SharedCredentials>(newSharedCredential);
return savedCredential;
});
LoggerProxy.verbose('New credential created', {
credentialId: newCredential.id,
ownerId: req.user.id,
});
return { id: id.toString(), ...rest };
}),
);
/**
* DELETE /credentials/:id
*/
credentialsController.delete(
'/:id',
ResponseHelper.send(async (req: CredentialRequest.Delete) => {
const { id: credentialId } = req.params;
const shared = await Db.collections.SharedCredentials!.findOne({
relations: ['credentials'],
where: whereClause({
user: req.user,
entityType: 'credentials',
entityId: credentialId,
}),
});
if (!shared) {
LoggerProxy.info('Attempt to delete credential blocked due to lack of permissions', {
credentialId,
userId: req.user.id,
});
throw new ResponseHelper.ResponseError(
`Credential with ID "${credentialId}" could not be found to be deleted.`,
undefined,
404,
);
}
await externalHooks.run('credentials.delete', [credentialId]);
await Db.collections.Credentials!.remove(shared.credentials);
return true;
}),
);
/**
* PATCH /credentials/:id
*/
credentialsController.patch(
'/:id',
ResponseHelper.send(async (req: CredentialRequest.Update): Promise<ICredentialsResponse> => {
const { id: credentialId } = req.params;
const updateData = new CredentialsEntity();
Object.assign(updateData, req.body);
await validateEntity(updateData);
const shared = await Db.collections.SharedCredentials!.findOne({
relations: ['credentials'],
where: whereClause({
user: req.user,
entityType: 'credentials',
entityId: credentialId,
}),
});
if (!shared) {
LoggerProxy.info('Attempt to update credential blocked due to lack of permissions', {
credentialId,
userId: req.user.id,
});
throw new ResponseHelper.ResponseError(
`Credential with ID "${credentialId}" could not be found to be updated.`,
undefined,
404,
);
}
const { credentials: credential } = shared;
// Add the date for newly added node access permissions
for (const nodeAccess of updateData.nodesAccess) {
if (!nodeAccess.date) {
nodeAccess.date = new Date();
}
}
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new ResponseHelper.ResponseError(
RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY,
undefined,
500,
);
}
const coreCredential = new Credentials(
{ id: credential.id.toString(), name: credential.name },
credential.type,
credential.nodesAccess,
credential.data,
);
const decryptedData = coreCredential.getData(encryptionKey);
// Do not overwrite the oauth data else data like the access or refresh token would get lost
// everytime anybody changes anything on the credentials even if it is just the name.
if (decryptedData.oauthTokenData) {
// @ts-ignore
updateData.data.oauthTokenData = decryptedData.oauthTokenData;
}
// Encrypt the data
const credentials = new Credentials(
{ id: credentialId, name: updateData.name },
updateData.type,
updateData.nodesAccess,
);
// @ts-ignore
credentials.setData(updateData.data, encryptionKey);
const newCredentialData = credentials.getDataToSave() as ICredentialsDb;
// Add special database related data
newCredentialData.updatedAt = new Date();
await externalHooks.run('credentials.update', [newCredentialData]);
// Update the credentials in DB
await Db.collections.Credentials!.update(credentialId, newCredentialData);
// We sadly get nothing back from "update". Neither if it updated a record
// nor the new value. So query now the updated entry.
const responseData = await Db.collections.Credentials!.findOne(credentialId);
if (responseData === undefined) {
throw new ResponseHelper.ResponseError(
`Credential ID "${credentialId}" could not be found to be updated.`,
undefined,
404,
);
}
// Remove the encrypted data as it is not needed in the frontend
const { id, data, ...rest } = responseData;
LoggerProxy.verbose('Credential updated', { credentialId });
return {
id: id.toString(),
...rest,
};
}),
);
/**
* GET /credentials/:id
*/
credentialsController.get(
'/:id',
ResponseHelper.send(async (req: CredentialRequest.Get) => {
const { id: credentialId } = req.params;
const shared = await Db.collections.SharedCredentials!.findOne({
relations: ['credentials'],
where: whereClause({
user: req.user,
entityType: 'credentials',
entityId: credentialId,
}),
});
if (!shared) {
throw new ResponseHelper.ResponseError(
`Credentials with ID "${credentialId}" could not be found.`,
undefined,
404,
);
}
const { credentials: credential } = shared;
if (req.query.includeData !== 'true') {
const { data, id, ...rest } = credential;
return {
id: id.toString(),
...rest,
};
}
const { data, id, ...rest } = credential;
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new ResponseHelper.ResponseError(
RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY,
undefined,
500,
);
}
const coreCredential = new Credentials(
{ id: credential.id.toString(), name: credential.name },
credential.type,
credential.nodesAccess,
credential.data,
);
return {
id: id.toString(),
data: coreCredential.getData(encryptionKey),
...rest,
};
}),
);
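A hedged usage sketch for the credential endpoints above (placeholder ID, default `rest` prefix, and an authenticated session are assumed; the response envelope produced by ResponseHelper.send is omitted):

// Metadata only: the encrypted `data` blob is stripped from the response.
await fetch('/rest/credentials/42', { credentials: 'include' });

// With decrypted data, returned only when the credential is visible to the requesting user per whereClause.
await fetch('/rest/credentials/42?includeData=true', { credentials: 'include' });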

View file

@@ -0,0 +1,8 @@
/* eslint-disable @typescript-eslint/naming-convention */
export const RESPONSE_ERROR_MESSAGES = {
NO_CREDENTIAL: 'Credential not found',
NO_ENCRYPTION_KEY: 'Encryption key missing to decrypt credentials',
};
export const AUTH_COOKIE_NAME = 'n8n-auth';

View file

@@ -8,12 +8,15 @@ import {
CreateDateColumn,
Entity,
Index,
+OneToMany,
PrimaryGeneratedColumn,
UpdateDateColumn,
} from 'typeorm';
+import { IsArray, IsObject, IsString, Length } from 'class-validator';

import config = require('../../../config');
import { DatabaseType, ICredentialsDb } from '../..';
+import { SharedCredentials } from './SharedCredentials';

function resolveDataType(dataType: string) {
const dbType = config.get('database.type') as DatabaseType;
@@ -51,21 +54,29 @@ export class CredentialsEntity implements ICredentialsDb {
@PrimaryGeneratedColumn()
id: number;

-@Column({
-length: 128,
-})
+@Column({ length: 128 })
+@IsString({ message: 'Credential `name` must be of type string.' })
+@Length(3, 128, {
+message: 'Credential name must be $constraint1 to $constraint2 characters long.',
+})
name: string;

@Column('text')
+@IsObject()
data: string;

@Index()
+@IsString({ message: 'Credential `type` must be of type string.' })
@Column({
length: 128,
})
type: string;

+@OneToMany(() => SharedCredentials, (sharedCredentials) => sharedCredentials.credentials)
+shared: SharedCredentials[];
+
@Column(resolveDataType('json'))
+@IsArray()
nodesAccess: ICredentialNodeAccess[];

@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })

View file

@@ -0,0 +1,77 @@
/* eslint-disable import/no-cycle */
import {
BeforeUpdate,
Column,
CreateDateColumn,
Entity,
OneToMany,
PrimaryGeneratedColumn,
Unique,
UpdateDateColumn,
} from 'typeorm';
import { IsDate, IsOptional, IsString, Length } from 'class-validator';
import config = require('../../../config');
import { DatabaseType } from '../../index';
import { User } from './User';
import { SharedWorkflow } from './SharedWorkflow';
import { SharedCredentials } from './SharedCredentials';
type RoleScopes = 'global' | 'workflow' | 'credential';
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
function getTimestampSyntax() {
const dbType = config.get('database.type') as DatabaseType;
const map: { [key in DatabaseType]: string } = {
sqlite: "STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')",
postgresdb: 'CURRENT_TIMESTAMP(3)',
mysqldb: 'CURRENT_TIMESTAMP(3)',
mariadb: 'CURRENT_TIMESTAMP(3)',
};
return map[dbType];
}
@Entity()
@Unique(['scope', 'name'])
export class Role {
@PrimaryGeneratedColumn()
id: number;
@Column({ length: 32 })
@IsString({ message: 'Role name must be of type string.' })
@Length(1, 32, { message: 'Role name must be 1 to 32 characters long.' })
name: string;
@Column()
scope: RoleScopes;
@OneToMany(() => User, (user) => user.globalRole)
globalForUsers: User[];
@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })
@IsOptional() // ignored by validation because set at DB level
@IsDate()
createdAt: Date;
@UpdateDateColumn({
precision: 3,
default: () => getTimestampSyntax(),
onUpdate: getTimestampSyntax(),
})
@IsOptional() // ignored by validation because set at DB level
@IsDate()
updatedAt: Date;
@OneToMany(() => SharedWorkflow, (sharedWorkflow) => sharedWorkflow.role)
sharedWorkflows: SharedWorkflow[];
@OneToMany(() => SharedCredentials, (sharedCredentials) => sharedCredentials.role)
sharedCredentials: SharedCredentials[];
@BeforeUpdate()
setUpdateDate(): void {
this.updatedAt = new Date();
}
}

View file

@@ -0,0 +1,18 @@
/* eslint-disable @typescript-eslint/explicit-module-boundary-types */
/* eslint-disable import/no-cycle */
import { Column, Entity, PrimaryColumn } from 'typeorm';
import { ISettingsDb } from '../..';
@Entity()
export class Settings implements ISettingsDb {
@PrimaryColumn()
key: string;
@Column()
value: string;
@Column()
loadOnStartup: boolean;
}

View file

@@ -0,0 +1,70 @@
/* eslint-disable import/no-cycle */
import {
BeforeUpdate,
CreateDateColumn,
Entity,
ManyToOne,
RelationId,
UpdateDateColumn,
} from 'typeorm';
import { IsDate, IsOptional } from 'class-validator';
import config = require('../../../config');
import { DatabaseType } from '../../index';
import { CredentialsEntity } from './CredentialsEntity';
import { User } from './User';
import { Role } from './Role';
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
function getTimestampSyntax() {
const dbType = config.get('database.type') as DatabaseType;
const map: { [key in DatabaseType]: string } = {
sqlite: "STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')",
postgresdb: 'CURRENT_TIMESTAMP(3)',
mysqldb: 'CURRENT_TIMESTAMP(3)',
mariadb: 'CURRENT_TIMESTAMP(3)',
};
return map[dbType];
}
@Entity()
export class SharedCredentials {
@ManyToOne(() => Role, (role) => role.sharedCredentials, { nullable: false })
role: Role;
@ManyToOne(() => User, (user) => user.sharedCredentials, { primary: true })
user: User;
@RelationId((sharedCredential: SharedCredentials) => sharedCredential.user)
userId: string;
@ManyToOne(() => CredentialsEntity, (credentials) => credentials.shared, {
primary: true,
onDelete: 'CASCADE',
})
credentials: CredentialsEntity;
@RelationId((sharedCredential: SharedCredentials) => sharedCredential.credentials)
credentialId: number;
@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })
@IsOptional() // ignored by validation because set at DB level
@IsDate()
createdAt: Date;
@UpdateDateColumn({
precision: 3,
default: () => getTimestampSyntax(),
onUpdate: getTimestampSyntax(),
})
@IsOptional() // ignored by validation because set at DB level
@IsDate()
updatedAt: Date;
@BeforeUpdate()
setUpdateDate(): void {
this.updatedAt = new Date();
}
}

View file

@@ -0,0 +1,70 @@
/* eslint-disable import/no-cycle */
import {
BeforeUpdate,
CreateDateColumn,
Entity,
ManyToOne,
RelationId,
UpdateDateColumn,
} from 'typeorm';
import { IsDate, IsOptional } from 'class-validator';
import config = require('../../../config');
import { DatabaseType } from '../../index';
import { WorkflowEntity } from './WorkflowEntity';
import { User } from './User';
import { Role } from './Role';
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
function getTimestampSyntax() {
const dbType = config.get('database.type') as DatabaseType;
const map: { [key in DatabaseType]: string } = {
sqlite: "STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')",
postgresdb: 'CURRENT_TIMESTAMP(3)',
mysqldb: 'CURRENT_TIMESTAMP(3)',
mariadb: 'CURRENT_TIMESTAMP(3)',
};
return map[dbType];
}
@Entity()
export class SharedWorkflow {
@ManyToOne(() => Role, (role) => role.sharedWorkflows, { nullable: false })
role: Role;
@ManyToOne(() => User, (user) => user.sharedWorkflows, { primary: true })
user: User;
@RelationId((sharedWorkflow: SharedWorkflow) => sharedWorkflow.user)
userId: string;
@ManyToOne(() => WorkflowEntity, (workflow) => workflow.shared, {
primary: true,
onDelete: 'CASCADE',
})
workflow: WorkflowEntity;
@RelationId((sharedWorkflow: SharedWorkflow) => sharedWorkflow.workflow)
workflowId: number;
@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })
@IsOptional() // ignored by validation because set at DB level
@IsDate()
createdAt: Date;
@UpdateDateColumn({
precision: 3,
default: () => getTimestampSyntax(),
onUpdate: getTimestampSyntax(),
})
@IsOptional() // ignored by validation because set at DB level
@IsDate()
updatedAt: Date;
@BeforeUpdate()
setUpdateDate(): void {
this.updatedAt = new Date();
}
}

View file

@@ -5,9 +5,10 @@ import {
Column,
CreateDateColumn,
Entity,
+Generated,
Index,
ManyToMany,
-PrimaryGeneratedColumn,
+PrimaryColumn,
UpdateDateColumn,
} from 'typeorm';
import { IsDate, IsOptional, IsString, Length } from 'class-validator';
@@ -15,6 +16,7 @@ import { IsDate, IsOptional, IsString, Length } from 'class-validator';
import config = require('../../../config');
import { DatabaseType } from '../../index';
import { ITagDb } from '../../Interfaces';
+import { idStringifier } from '../utils/transformers';
import { WorkflowEntity } from './WorkflowEntity';

// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
@@ -33,13 +35,16 @@ function getTimestampSyntax() {
@Entity()
export class TagEntity implements ITagDb {
-@PrimaryGeneratedColumn()
+@Generated()
+@PrimaryColumn({
+transformer: idStringifier,
+})
id: number;

@Column({ length: 24 })
@Index({ unique: true })
@IsString({ message: 'Tag name must be of type string.' })
-@Length(1, 24, { message: 'Tag name must be 1 to 24 characters long.' })
+@Length(1, 24, { message: 'Tag name must be $constraint1 to $constraint2 characters long.' })
name: string;

@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })

View file

@@ -0,0 +1,137 @@
/* eslint-disable import/no-cycle */
import {
AfterLoad,
AfterUpdate,
BeforeUpdate,
Column,
ColumnOptions,
CreateDateColumn,
Entity,
Index,
OneToMany,
ManyToOne,
PrimaryGeneratedColumn,
UpdateDateColumn,
} from 'typeorm';
import { IsEmail, IsString, Length } from 'class-validator';
import config = require('../../../config');
import { DatabaseType, IPersonalizationSurveyAnswers } from '../..';
import { Role } from './Role';
import { SharedWorkflow } from './SharedWorkflow';
import { SharedCredentials } from './SharedCredentials';
import { NoXss } from '../utils/customValidators';
import { answersFormatter } from '../utils/transformers';
export const MIN_PASSWORD_LENGTH = 8;
export const MAX_PASSWORD_LENGTH = 64;
function resolveDataType(dataType: string) {
const dbType = config.get('database.type') as DatabaseType;
const typeMap: { [key in DatabaseType]: { [key: string]: string } } = {
sqlite: {
json: 'simple-json',
},
postgresdb: {
datetime: 'timestamptz',
},
mysqldb: {},
mariadb: {},
};
return typeMap[dbType][dataType] ?? dataType;
}
// eslint-disable-next-line @typescript-eslint/explicit-module-boundary-types
function getTimestampSyntax() {
const dbType = config.get('database.type') as DatabaseType;
const map: { [key in DatabaseType]: string } = {
sqlite: "STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')",
postgresdb: 'CURRENT_TIMESTAMP(3)',
mysqldb: 'CURRENT_TIMESTAMP(3)',
mariadb: 'CURRENT_TIMESTAMP(3)',
};
return map[dbType];
}
@Entity()
export class User {
@PrimaryGeneratedColumn('uuid')
id: string;
@Column({ length: 254 })
@Index({ unique: true })
@IsEmail()
email: string;
@Column({ length: 32, nullable: true })
@NoXss()
@IsString({ message: 'First name must be of type string.' })
@Length(1, 32, { message: 'First name must be $constraint1 to $constraint2 characters long.' })
firstName: string;
@Column({ length: 32, nullable: true })
@NoXss()
@IsString({ message: 'Last name must be of type string.' })
@Length(1, 32, { message: 'Last name must be $constraint1 to $constraint2 characters long.' })
lastName: string;
@Column({ nullable: true })
@IsString({ message: 'Password must be of type string.' })
password?: string;
@Column({ type: String, nullable: true })
resetPasswordToken?: string | null;
// Expiration timestamp saved in seconds
@Column({ type: Number, nullable: true })
resetPasswordTokenExpiration?: number | null;
@Column({
type: resolveDataType('json') as ColumnOptions['type'],
nullable: true,
transformer: answersFormatter,
})
personalizationAnswers: IPersonalizationSurveyAnswers | null;
@ManyToOne(() => Role, (role) => role.globalForUsers, {
cascade: true,
nullable: false,
})
globalRole: Role;
@OneToMany(() => SharedWorkflow, (sharedWorkflow) => sharedWorkflow.user)
sharedWorkflows: SharedWorkflow[];
@OneToMany(() => SharedCredentials, (sharedCredentials) => sharedCredentials.user)
sharedCredentials: SharedCredentials[];
@CreateDateColumn({ precision: 3, default: () => getTimestampSyntax() })
createdAt: Date;
@UpdateDateColumn({
precision: 3,
default: () => getTimestampSyntax(),
onUpdate: getTimestampSyntax(),
})
updatedAt: Date;
@BeforeUpdate()
setUpdateDate(): void {
this.updatedAt = new Date();
}
/**
* Whether the user is pending setup completion.
*/
isPending: boolean;
@AfterLoad()
@AfterUpdate()
computeIsPending(): void {
this.isPending = this.password == null;
}
}
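A hedged sketch of the derived `isPending` flag (the entity import path is assumed; in practice TypeORM invokes computeIsPending through the @AfterLoad/@AfterUpdate hooks):

import { User } from './databases/entities/User';

const invited = new User(); // no password yet, i.e. the invite has not been accepted
invited.computeIsPending();
console.log(invited.isPending); // true

const active = new User();
active.password = '$2a$10$...'; // placeholder hash
active.computeIsPending();
console.log(active.isPending); // false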

View file

@@ -13,6 +13,7 @@ import {
Index,
JoinTable,
ManyToMany,
+OneToMany,
PrimaryGeneratedColumn,
UpdateDateColumn,
} from 'typeorm';
@@ -20,6 +21,7 @@ import {
import config = require('../../../config');
import { DatabaseType, IWorkflowDb } from '../..';
import { TagEntity } from './TagEntity';
+import { SharedWorkflow } from './SharedWorkflow';

function resolveDataType(dataType: string) {
const dbType = config.get('database.type') as DatabaseType;
@@ -57,8 +59,11 @@ export class WorkflowEntity implements IWorkflowDb {
@PrimaryGeneratedColumn()
id: number;

+// TODO: Add XSS check
@Index({ unique: true })
-@Length(1, 128, { message: 'Workflow name must be 1 to 128 characters long.' })
+@Length(1, 128, {
+message: 'Workflow name must be $constraint1 to $constraint2 characters long.',
+})
@Column({ length: 128 })
name: string;
@@ -107,6 +112,9 @@ export class WorkflowEntity implements IWorkflowDb {
})
tags: TagEntity[];

+@OneToMany(() => SharedWorkflow, (sharedWorkflow) => sharedWorkflow.workflow)
+shared: SharedWorkflow[];
+
@BeforeUpdate()
setUpdateDate() {
this.updatedAt = new Date();

View file

@@ -5,6 +5,11 @@ import { ExecutionEntity } from './ExecutionEntity';
import { WorkflowEntity } from './WorkflowEntity';
import { WebhookEntity } from './WebhookEntity';
import { TagEntity } from './TagEntity';
+import { User } from './User';
+import { Role } from './Role';
+import { Settings } from './Settings';
+import { SharedWorkflow } from './SharedWorkflow';
+import { SharedCredentials } from './SharedCredentials';

export const entities = {
CredentialsEntity,
@@ -12,4 +17,9 @@ export const entities = {
WorkflowEntity,
WebhookEntity,
TagEntity,
+User,
+Role,
+Settings,
+SharedWorkflow,
+SharedCredentials,
};

View file

@@ -6,7 +6,6 @@ export class AddWaitColumnId1626183952959 implements MigrationInterface {
async up(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
-console.log('\n\nINFO: Started with migration for wait functionality.\n Depending on the number of saved executions, that may take a little bit.\n\n');

await queryRunner.query('ALTER TABLE `' + tablePrefix + 'execution_entity` ADD `waitTill` DATETIME NULL');
await queryRunner.query('CREATE INDEX `IDX_' + tablePrefix + 'ca4a71b47f28ac6ea88293a8e2` ON `' + tablePrefix + 'execution_entity` (`waitTill`)');

View file

@@ -9,8 +9,6 @@ export class UpdateWorkflowCredentials1630451444017 implements MigrationInterfac
name = 'UpdateWorkflowCredentials1630451444017';

public async up(queryRunner: QueryRunner): Promise<void> {
-console.log('Start migration', this.name);
-console.time(this.name);
const tablePrefix = config.get('database.tablePrefix');
const helpers = new MigrationHelpers(queryRunner);
@@ -145,7 +143,6 @@ export class UpdateWorkflowCredentials1630451444017 implements MigrationInterfac
queryRunner.query(updateQuery, updateParams);
}
});
-console.timeEnd(this.name);
}

public async down(queryRunner: QueryRunner): Promise<void> {

View file

@@ -2,31 +2,70 @@ import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';

export class AddExecutionEntityIndexes1644424784709 implements MigrationInterface {
-name = 'AddExecutionEntityIndexes1644424784709'
+name = 'AddExecutionEntityIndexes1644424784709';

public async up(queryRunner: QueryRunner): Promise<void> {
-console.log('\n\nINFO: Started migration for execution entity indexes.\n Depending on the number of saved executions, it may take a while.\n\n');
const tablePrefix = config.get('database.tablePrefix');

-await queryRunner.query('DROP INDEX `IDX_c4d999a5e90784e8caccf5589d` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('DROP INDEX `IDX_ca4a71b47f28ac6ea88293a8e2` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('CREATE INDEX `IDX_06da892aaf92a48e7d3e400003` ON `' + tablePrefix + 'execution_entity` (`workflowId`, `waitTill`, `id`)');
-await queryRunner.query('CREATE INDEX `IDX_78d62b89dc1433192b86dce18a` ON `' + tablePrefix + 'execution_entity` (`workflowId`, `finished`, `id`)');
-await queryRunner.query('CREATE INDEX `IDX_1688846335d274033e15c846a4` ON `' + tablePrefix + 'execution_entity` (`finished`, `id`)');
-await queryRunner.query('CREATE INDEX `IDX_b94b45ce2c73ce46c54f20b5f9` ON `' + tablePrefix + 'execution_entity` (`waitTill`, `id`)');
-await queryRunner.query('CREATE INDEX `IDX_81fc04c8a17de15835713505e4` ON `' + tablePrefix + 'execution_entity` (`workflowId`, `id`)');
+await queryRunner.query(
+'DROP INDEX `IDX_c4d999a5e90784e8caccf5589d` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'DROP INDEX `IDX_ca4a71b47f28ac6ea88293a8e2` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_06da892aaf92a48e7d3e400003` ON `' +
+tablePrefix +
+'execution_entity` (`workflowId`, `waitTill`, `id`)',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_78d62b89dc1433192b86dce18a` ON `' +
+tablePrefix +
+'execution_entity` (`workflowId`, `finished`, `id`)',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_1688846335d274033e15c846a4` ON `' +
+tablePrefix +
+'execution_entity` (`finished`, `id`)',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_b94b45ce2c73ce46c54f20b5f9` ON `' +
+tablePrefix +
+'execution_entity` (`waitTill`, `id`)',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_81fc04c8a17de15835713505e4` ON `' +
+tablePrefix +
+'execution_entity` (`workflowId`, `id`)',
+);
}

public async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
-await queryRunner.query('DROP INDEX `IDX_81fc04c8a17de15835713505e4` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('DROP INDEX `IDX_b94b45ce2c73ce46c54f20b5f9` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('DROP INDEX `IDX_1688846335d274033e15c846a4` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('DROP INDEX `IDX_78d62b89dc1433192b86dce18a` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('DROP INDEX `IDX_06da892aaf92a48e7d3e400003` ON `' + tablePrefix + 'execution_entity`');
-await queryRunner.query('CREATE INDEX `IDX_ca4a71b47f28ac6ea88293a8e2` ON `' + tablePrefix + 'execution_entity` (`waitTill`)');
-await queryRunner.query('CREATE INDEX `IDX_c4d999a5e90784e8caccf5589d` ON `' + tablePrefix + 'execution_entity` (`workflowId`)');
+await queryRunner.query(
+'DROP INDEX `IDX_81fc04c8a17de15835713505e4` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'DROP INDEX `IDX_b94b45ce2c73ce46c54f20b5f9` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'DROP INDEX `IDX_1688846335d274033e15c846a4` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'DROP INDEX `IDX_78d62b89dc1433192b86dce18a` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'DROP INDEX `IDX_06da892aaf92a48e7d3e400003` ON `' + tablePrefix + 'execution_entity`',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_ca4a71b47f28ac6ea88293a8e2` ON `' +
+tablePrefix +
+'execution_entity` (`waitTill`)',
+);
+await queryRunner.query(
+'CREATE INDEX `IDX_c4d999a5e90784e8caccf5589d` ON `' +
+tablePrefix +
+'execution_entity` (`workflowId`)',
+);
}
}

View file

@@ -0,0 +1,171 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import { v4 as uuid } from 'uuid';
import config = require('../../../../config');
import { loadSurveyFromDisk } from '../../utils/migrationHelpers';
export class CreateUserManagement1646992772331 implements MigrationInterface {
name = 'CreateUserManagement1646992772331';
public async up(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE ${tablePrefix}role (
\`id\` int NOT NULL AUTO_INCREMENT,
\`name\` varchar(32) NOT NULL,
\`scope\` varchar(255) NOT NULL,
\`createdAt\` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`updatedAt\` datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (\`id\`),
UNIQUE KEY \`UQ_${tablePrefix}5b49d0f504f7ef31045a1fb2eb8\` (\`scope\`,\`name\`)
) ENGINE=InnoDB;`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}user (
\`id\` VARCHAR(36) NOT NULL,
\`email\` VARCHAR(255) NULL DEFAULT NULL,
\`firstName\` VARCHAR(32) NULL DEFAULT NULL,
\`lastName\` VARCHAR(32) NULL DEFAULT NULL,
\`password\` VARCHAR(255) NULL DEFAULT NULL,
\`resetPasswordToken\` VARCHAR(255) NULL DEFAULT NULL,
\`resetPasswordTokenExpiration\` INT NULL DEFAULT NULL,
\`personalizationAnswers\` TEXT NULL DEFAULT NULL,
\`createdAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`updatedAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`globalRoleId\` INT NOT NULL,
PRIMARY KEY (\`id\`),
UNIQUE INDEX \`IDX_${tablePrefix}e12875dfb3b1d92d7d7c5377e2\` (\`email\` ASC),
INDEX \`FK_${tablePrefix}f0609be844f9200ff4365b1bb3d\` (\`globalRoleId\` ASC)
) ENGINE=InnoDB;`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}user\` ADD CONSTRAINT \`FK_${tablePrefix}f0609be844f9200ff4365b1bb3d\` FOREIGN KEY (\`globalRoleId\`) REFERENCES \`${tablePrefix}role\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}shared_workflow (
\`createdAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`updatedAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`roleId\` INT NOT NULL,
\`userId\` VARCHAR(36) NOT NULL,
\`workflowId\` INT NOT NULL,
INDEX \`FK_${tablePrefix}3540da03964527aa24ae014b780x\` (\`roleId\` ASC),
INDEX \`FK_${tablePrefix}82b2fd9ec4e3e24209af8160282x\` (\`userId\` ASC),
INDEX \`FK_${tablePrefix}b83f8d2530884b66a9c848c8b88x\` (\`workflowId\` ASC),
PRIMARY KEY (\`userId\`, \`workflowId\`)
) ENGINE=InnoDB;`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_workflow\` ADD CONSTRAINT \`FK_${tablePrefix}3540da03964527aa24ae014b780\` FOREIGN KEY (\`roleId\`) REFERENCES \`${tablePrefix}role\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_workflow\` ADD CONSTRAINT \`FK_${tablePrefix}82b2fd9ec4e3e24209af8160282\` FOREIGN KEY (\`userId\`) REFERENCES \`${tablePrefix}user\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_workflow\` ADD CONSTRAINT \`FK_${tablePrefix}b83f8d2530884b66a9c848c8b88\` FOREIGN KEY (\`workflowId\`) REFERENCES \`${tablePrefix}workflow_entity\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}shared_credentials (
\`createdAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`updatedAt\` DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
\`roleId\` INT NOT NULL,
\`userId\` VARCHAR(36) NOT NULL,
\`credentialsId\` INT NOT NULL,
INDEX \`FK_${tablePrefix}c68e056637562000b68f480815a\` (\`roleId\` ASC),
INDEX \`FK_${tablePrefix}484f0327e778648dd04f1d70493\` (\`userId\` ASC),
INDEX \`FK_${tablePrefix}68661def1d4bcf2451ac8dbd949\` (\`credentialsId\` ASC),
PRIMARY KEY (\`userId\`, \`credentialsId\`)
) ENGINE=InnoDB;`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_credentials\` ADD CONSTRAINT \`FK_${tablePrefix}484f0327e778648dd04f1d70493\` FOREIGN KEY (\`userId\`) REFERENCES \`${tablePrefix}user\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_credentials\` ADD CONSTRAINT \`FK_${tablePrefix}68661def1d4bcf2451ac8dbd949\` FOREIGN KEY (\`credentialsId\`) REFERENCES \`${tablePrefix}credentials_entity\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`ALTER TABLE \`${tablePrefix}shared_credentials\` ADD CONSTRAINT \`FK_${tablePrefix}c68e056637562000b68f480815a\` FOREIGN KEY (\`roleId\`) REFERENCES \`${tablePrefix}role\`(\`id\`) ON DELETE CASCADE ON UPDATE NO ACTION`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}settings (
\`key\` VARCHAR(255) NOT NULL,
\`value\` TEXT NOT NULL,
\`loadOnStartup\` TINYINT(1) NOT NULL DEFAULT 0,
PRIMARY KEY (\`key\`)
) ENGINE=InnoDB;`,
);
await queryRunner.query(
`ALTER TABLE ${tablePrefix}workflow_entity DROP INDEX IDX_${tablePrefix}943d8f922be094eb507cb9a7f9`,
);
// Insert initial roles
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ("owner", "global");`,
);
const instanceOwnerRole = await queryRunner.query('SELECT LAST_INSERT_ID() as insertId');
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ("member", "global");`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ("owner", "workflow");`,
);
const workflowOwnerRole = await queryRunner.query('SELECT LAST_INSERT_ID() as insertId');
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ("owner", "credential");`,
);
const credentialOwnerRole = await queryRunner.query('SELECT LAST_INSERT_ID() as insertId');
const survey = loadSurveyFromDisk();
const ownerUserId = uuid();
await queryRunner.query(
`INSERT INTO ${tablePrefix}user (id, globalRoleId, personalizationAnswers) values (?, ?, ?)`,
[ownerUserId, instanceOwnerRole[0].insertId, survey],
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}shared_workflow (createdAt, updatedAt, roleId, userId, workflowId) select
NOW(), NOW(), '${workflowOwnerRole[0].insertId}', '${ownerUserId}', id FROM ${tablePrefix}workflow_entity`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}shared_credentials (createdAt, updatedAt, roleId, userId, credentialsId) SELECT NOW(), NOW(), '${credentialOwnerRole[0].insertId}', '${ownerUserId}', id FROM ${tablePrefix}credentials_entity`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}settings (\`key\`, value, loadOnStartup) VALUES ("userManagement.isInstanceOwnerSetUp", "false", 1), ("userManagement.skipInstanceOwnerSetup", "false", 1)`,
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`ALTER TABLE ${tablePrefix}workflow_entity ADD UNIQUE INDEX \`IDX_${tablePrefix}943d8f922be094eb507cb9a7f9\` (\`name\`)`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}shared_credentials"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}shared_workflow"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}user"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}role"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}settings"`);
}
}
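
`loadSurveyFromDisk()` above supplies the owner's `personalizationAnswers`; its implementation lives in `migrationHelpers` and is not part of this excerpt. A plausible sketch, with the file location and return shape assumed for illustration:

// Hypothetical sketch of loadSurveyFromDisk(): returns the previously saved
// personalization survey as a JSON string, or null if none exists.
// The survey path is an assumption, not taken from this PR.
import { readFileSync } from 'fs';
import { join } from 'path';

export function loadSurveyFromDisk(): string | null {
	try {
		const surveyPath = join(process.env.N8N_USER_FOLDER ?? '.', '.n8n', 'survey.json');
		const survey = readFileSync(surveyPath, 'utf-8');
		JSON.parse(survey); // ensure the file actually contains valid JSON
		return survey;
	} catch {
		return null;
	}
}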


@ -11,6 +11,7 @@
import { CertifyCorrectCollation1623936588000 } from './1623936588000-CertifyCorrectCollation';
import { AddWaitColumnId1626183952959 } from './1626183952959-AddWaitColumn';
import { UpdateWorkflowCredentials1630451444017 } from './1630451444017-UpdateWorkflowCredentials';
import { AddExecutionEntityIndexes1644424784709 } from './1644424784709-AddExecutionEntityIndexes';
import { CreateUserManagement1646992772331 } from './1646992772331-CreateUserManagement';
export const mysqlMigrations = [
InitialMigration1588157391238,
@ -26,4 +27,5 @@
AddWaitColumnId1626183952959,
UpdateWorkflowCredentials1630451444017,
AddExecutionEntityIndexes1644424784709,
CreateUserManagement1646992772331,
];
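
For context, a migrations array like `mysqlMigrations` is handed to TypeORM when the connection is created. The sketch below uses TypeORM 0.2.x-style connection options; the import path and credential values are assumptions for illustration, not code from this PR:

// Hedged sketch: registering the migrations array with TypeORM and running
// pending migrations on startup. Option names are TypeORM's; values are placeholders.
import { createConnection, ConnectionOptions } from 'typeorm';
import { mysqlMigrations } from './databases/mysqldb/migrations'; // assumed path

const options: ConnectionOptions = {
	type: 'mysql',
	host: 'localhost',
	port: 3306,
	username: 'n8n',
	password: 'n8n',
	database: 'n8n',
	entityPrefix: process.env.DB_TABLE_PREFIX ?? '',
	migrations: mysqlMigrations,
	migrationsRun: true, // apply pending migrations automatically
	migrationsTableName: 'migrations',
};

createConnection(options).catch(console.error);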


@ -1,59 +1,70 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import config = require('../../../../config');
export class UniqueWorkflowNames1620824779533 implements MigrationInterface {
name = 'UniqueWorkflowNames1620824779533';
async up(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
const workflowNames = await queryRunner.query(`
SELECT name
FROM ${tablePrefix}workflow_entity
`);
for (const { name } of workflowNames) {
const [duplicatesQuery, parameters] = queryRunner.connection.driver.escapeQueryWithParameters(
`
SELECT id, name
FROM ${tablePrefix}workflow_entity
WHERE name = :name
ORDER BY "createdAt" ASC
`,
{ name },
{},
);
const duplicates = await queryRunner.query(duplicatesQuery, parameters);
if (duplicates.length > 1) {
await Promise.all(
duplicates.map(({ id, name }: { id: number; name: string }, index: number) => {
if (index === 0) return Promise.resolve();
const [updateQuery, updateParams] =
queryRunner.connection.driver.escapeQueryWithParameters(
`
UPDATE ${tablePrefix}workflow_entity
SET name = :name
WHERE id = '${id}'
`,
{ name: `${name} ${index + 1}` },
{},
);
return queryRunner.query(updateQuery, updateParams);
}),
);
}
}
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefixPure}a252c527c4c89237221fe2c0ab" ON ${tablePrefix}workflow_entity ("name") `,
);
}
async down(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(`DROP INDEX "IDX_${tablePrefixPure}a252c527c4c89237221fe2c0ab"`);
}
}
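
The loop above suffixes later duplicates with a counter so the unique index can be created; a quick illustration of the renaming scheme on hypothetical data:

// Illustration only: the oldest duplicate keeps its name, later ones get " 2", " 3", ...
const duplicates = ['My workflow', 'My workflow', 'My workflow'];
const renamed = duplicates.map((name, index) => (index === 0 ? name : `${name} ${index + 1}`));
console.log(renamed); // [ 'My workflow', 'My workflow 2', 'My workflow 3' ]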


@ -11,7 +11,6 @@ export class AddwaitTill1626176912946 implements MigrationInterface {
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(`ALTER TABLE ${tablePrefix}execution_entity ADD "waitTill" TIMESTAMP`);
await queryRunner.query(`CREATE INDEX IF NOT EXISTS IDX_${tablePrefixPure}ca4a71b47f28ac6ea88293a8e2 ON ${tablePrefix}execution_entity ("waitTill")`);


@ -9,8 +9,6 @@ export class UpdateWorkflowCredentials1630419189837 implements MigrationInterface {
name = 'UpdateWorkflowCredentials1630419189837';
public async up(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const schema = config.get('database.postgresdb.schema');
if (schema) {
@ -151,7 +149,6 @@ export class UpdateWorkflowCredentials1630419189837 implements MigrationInterface {
queryRunner.query(updateQuery, updateParams);
}
});
}
public async down(queryRunner: QueryRunner): Promise<void> {


@ -2,46 +2,75 @@ import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
export class AddExecutionEntityIndexes1644422880309 implements MigrationInterface {
name = 'AddExecutionEntityIndexes1644422880309';
public async up(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(
`DROP INDEX "${schema}".IDX_${tablePrefixPure}c4d999a5e90784e8caccf5589d`,
);
await queryRunner.query(
`DROP INDEX "${schema}".IDX_${tablePrefixPure}ca4a71b47f28ac6ea88293a8e2`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}33228da131bb1112247cf52a42" ON ${tablePrefix}execution_entity ("stoppedAt") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}58154df94c686818c99fb754ce" ON ${tablePrefix}execution_entity ("workflowId", "waitTill", "id") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}4f474ac92be81610439aaad61e" ON ${tablePrefix}execution_entity ("workflowId", "finished", "id") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}72ffaaab9f04c2c1f1ea86e662" ON ${tablePrefix}execution_entity ("finished", "id") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}85b981df7b444f905f8bf50747" ON ${tablePrefix}execution_entity ("waitTill", "id") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}d160d4771aba5a0d78943edbe3" ON ${tablePrefix}execution_entity ("workflowId", "id") `,
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}d160d4771aba5a0d78943edbe3"`,
);
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}85b981df7b444f905f8bf50747"`,
);
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}72ffaaab9f04c2c1f1ea86e662"`,
);
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}4f474ac92be81610439aaad61e"`,
);
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}58154df94c686818c99fb754ce"`,
);
await queryRunner.query(
`DROP INDEX "${schema}"."IDX_${tablePrefixPure}33228da131bb1112247cf52a42"`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}ca4a71b47f28ac6ea88293a8e2" ON ${tablePrefix}execution_entity ("waitTill") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}c4d999a5e90784e8caccf5589d" ON ${tablePrefix}execution_entity ("workflowId") `,
);
}
}


@ -0,0 +1,160 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import { v4 as uuid } from 'uuid';
import config = require('../../../../config');
import { loadSurveyFromDisk } from '../../utils/migrationHelpers';
export class CreateUserManagement1646992772331 implements MigrationInterface {
name = 'CreateUserManagement1646992772331';
public async up(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(
`CREATE TABLE ${tablePrefix}role (
"id" serial NOT NULL,
"name" VARCHAR(32) NOT NULL,
"scope" VARCHAR(255) NOT NULL,
"createdAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "PK_${tablePrefixPure}e853ce24e8200abe5721d2c6ac552b73" PRIMARY KEY ("id"),
CONSTRAINT "UQ_${tablePrefixPure}5b49d0f504f7ef31045a1fb2eb8" UNIQUE ("scope", "name")
);`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}user (
"id" UUID NOT NULL DEFAULT uuid_in(overlay(overlay(md5(random()::text || ':' || clock_timestamp()::text) placing '4' from 13) placing to_hex(floor(random()*(11-8+1) + 8)::int)::text from 17)::cstring),
"email" VARCHAR(255),
"firstName" VARCHAR(32),
"lastName" VARCHAR(32),
"password" VARCHAR(255),
"resetPasswordToken" VARCHAR(255),
"resetPasswordTokenExpiration" int,
"personalizationAnswers" text,
"createdAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"globalRoleId" int NOT NULL,
CONSTRAINT "PK_${tablePrefixPure}ea8f538c94b6e352418254ed6474a81f" PRIMARY KEY ("id"),
CONSTRAINT "UQ_${tablePrefixPure}e12875dfb3b1d92d7d7c5377e2" UNIQUE (email),
CONSTRAINT "FK_${tablePrefixPure}f0609be844f9200ff4365b1bb3d" FOREIGN KEY ("globalRoleId") REFERENCES ${tablePrefix}role (id)
);`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}shared_workflow (
"createdAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"roleId" INT NOT NULL,
"userId" UUID NOT NULL,
"workflowId" INT NOT NULL,
CONSTRAINT "PK_${tablePrefixPure}cc5d5a71c7b2591f5154ffb0c785e85e" PRIMARY KEY ("userId", "workflowId"),
CONSTRAINT "FK_${tablePrefixPure}3540da03964527aa24ae014b780" FOREIGN KEY ("roleId") REFERENCES ${tablePrefix}role ("id") ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT "FK_${tablePrefixPure}82b2fd9ec4e3e24209af8160282" FOREIGN KEY ("userId") REFERENCES ${tablePrefix}user ("id") ON DELETE CASCADE ON UPDATE NO ACTION,
CONSTRAINT "FK_${tablePrefixPure}b83f8d2530884b66a9c848c8b88" FOREIGN KEY ("workflowId") REFERENCES
${tablePrefixPure}workflow_entity ("id") ON DELETE CASCADE ON UPDATE NO ACTION
);`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}65a0933c0f19d278881653bf81d35064" ON "shared_workflow" ("workflowId");`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}shared_credentials (
"createdAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
"roleId" INT NOT NULL,
"userId" UUID NOT NULL,
"credentialsId" INT NOT NULL,
CONSTRAINT "PK_${tablePrefixPure}10dd1527ffb639609be7aadd98f628c6" PRIMARY KEY ("userId", "credentialsId"),
CONSTRAINT "FK_${tablePrefixPure}c68e056637562000b68f480815a" FOREIGN KEY ("roleId") REFERENCES ${tablePrefix}role ("id") ON DELETE NO ACTION ON UPDATE NO ACTION,
CONSTRAINT "FK_${tablePrefixPure}484f0327e778648dd04f1d70493" FOREIGN KEY ("userId") REFERENCES ${tablePrefix}user ("id") ON DELETE CASCADE ON UPDATE NO ACTION,
CONSTRAINT "FK_${tablePrefixPure}68661def1d4bcf2451ac8dbd949" FOREIGN KEY ("credentialsId") REFERENCES ${tablePrefix}credentials_entity ("id") ON DELETE CASCADE ON UPDATE NO ACTION
);`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefixPure}829d16efa0e265cb076d50eca8d21733" ON ${tablePrefix}shared_credentials ("credentialsId");`,
);
await queryRunner.query(
`CREATE TABLE ${tablePrefix}settings (
"key" VARCHAR(255) NOT NULL,
"value" TEXT NOT NULL,
"loadOnStartup" boolean NOT NULL DEFAULT false,
CONSTRAINT "PK_${tablePrefixPure}dc0fe14e6d9943f268e7b119f69ab8bd" PRIMARY KEY ("key")
);`,
);
await queryRunner.query(`DROP INDEX "IDX_${tablePrefixPure}a252c527c4c89237221fe2c0ab"`);
// Insert initial roles
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ('owner', 'global');`,
);
const instanceOwnerRole = await queryRunner.query('SELECT lastval() as "insertId"');
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ('member', 'global');`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ('owner', 'workflow');`,
);
const workflowOwnerRole = await queryRunner.query('SELECT lastval() as "insertId"');
await queryRunner.query(
`INSERT INTO ${tablePrefix}role (name, scope) VALUES ('owner', 'credential');`,
);
const credentialOwnerRole = await queryRunner.query('SELECT lastval() as "insertId"');
const survey = loadSurveyFromDisk();
const ownerUserId = uuid();
await queryRunner.query(
`INSERT INTO ${tablePrefix}user ("id", "globalRoleId", "personalizationAnswers") values ($1, $2, $3)`,
[ownerUserId, instanceOwnerRole[0].insertId, survey],
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}shared_workflow ("createdAt", "updatedAt", "roleId", "userId", "workflowId") select
NOW(), NOW(), '${workflowOwnerRole[0].insertId}', '${ownerUserId}', "id" FROM ${tablePrefix}workflow_entity`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}shared_credentials ("createdAt", "updatedAt", "roleId", "userId", "credentialsId") SELECT NOW(), NOW(), '${credentialOwnerRole[0].insertId}', '${ownerUserId}', "id" FROM ${tablePrefix}credentials_entity`,
);
await queryRunner.query(
`INSERT INTO ${tablePrefix}settings ("key", "value", "loadOnStartup") VALUES ('userManagement.isInstanceOwnerSetUp', 'false', true), ('userManagement.skipInstanceOwnerSetup', 'false', true)`,
);
}
public async down(queryRunner: QueryRunner): Promise<void> {
let tablePrefix = config.get('database.tablePrefix');
const tablePrefixPure = tablePrefix;
const schema = config.get('database.postgresdb.schema');
if (schema) {
tablePrefix = schema + '.' + tablePrefix;
}
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefixPure}a252c527c4c89237221fe2c0ab" ON ${tablePrefix}workflow_entity ("name")`,
);
await queryRunner.query(`DROP TABLE ${tablePrefix}shared_credentials`);
await queryRunner.query(`DROP TABLE ${tablePrefix}shared_workflow`);
await queryRunner.query(`DROP TABLE ${tablePrefix}user`);
await queryRunner.query(`DROP TABLE ${tablePrefix}role`);
await queryRunner.query(`DROP TABLE ${tablePrefix}settings`);
}
}
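
The Postgres variant reads the generated role ids back with `SELECT lastval()`, which returns the most recently assigned sequence value in the current session and therefore has to run directly after the corresponding `INSERT`. A hedged sketch of the `RETURNING` alternative (illustrative, not code from this PR):

// Hedged sketch: fetching the generated id atomically with RETURNING instead of
// a follow-up SELECT lastval(). Table and column names match the migration above.
import { QueryRunner } from 'typeorm';

async function insertRoleReturningId(
	queryRunner: QueryRunner,
	tablePrefix: string,
	name: string,
	scope: string,
): Promise<number> {
	const rows: Array<{ id: number }> = await queryRunner.query(
		`INSERT INTO ${tablePrefix}role (name, scope) VALUES ($1, $2) RETURNING id`,
		[name, scope],
	);
	return rows[0].id;
}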


@ -9,6 +9,7 @@ import { AddwaitTill1626176912946 } from './1626176912946-AddwaitTill';
import { UpdateWorkflowCredentials1630419189837 } from './1630419189837-UpdateWorkflowCredentials';
import { AddExecutionEntityIndexes1644422880309 } from './1644422880309-AddExecutionEntityIndexes';
import { IncreaseTypeVarcharLimit1646834195327 } from './1646834195327-IncreaseTypeVarcharLimit';
import { CreateUserManagement1646992772331 } from './1646992772331-CreateUserManagement';
export const postgresMigrations = [
InitialMigration1587669153312,
@ -22,4 +23,5 @@ export const postgresMigrations = [
UpdateWorkflowCredentials1630419189837,
AddExecutionEntityIndexes1644422880309,
IncreaseTypeVarcharLimit1646834195327,
CreateUserManagement1646992772331,
];


@ -1,21 +1,37 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class InitialMigration1588102412422 implements MigrationInterface {
name = 'InitialMigration1588102412422';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE IF NOT EXISTS "${tablePrefix}credentials_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "data" text NOT NULL, "type" varchar(128) NOT NULL, "nodesAccess" text NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL)`,
undefined,
);
await queryRunner.query(
`CREATE INDEX IF NOT EXISTS "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8" ON "${tablePrefix}credentials_entity" ("type") `,
undefined,
);
await queryRunner.query(
`CREATE TABLE IF NOT EXISTS "${tablePrefix}execution_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "data" text NOT NULL, "finished" boolean NOT NULL, "mode" varchar NOT NULL, "retryOf" varchar, "retrySuccessId" varchar, "startedAt" datetime NOT NULL, "stoppedAt" datetime NOT NULL, "workflowData" text NOT NULL, "workflowId" varchar)`,
undefined,
);
await queryRunner.query(
`CREATE INDEX IF NOT EXISTS "IDX_${tablePrefix}c4d999a5e90784e8caccf5589d" ON "${tablePrefix}execution_entity" ("workflowId") `,
undefined,
);
await queryRunner.query(
`CREATE TABLE IF NOT EXISTS "${tablePrefix}workflow_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "active" boolean NOT NULL, "nodes" text NOT NULL, "connections" text NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL, "settings" text, "staticData" text)`,
undefined,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
@ -27,5 +43,4 @@ export class InitialMigration1588102412422 implements MigrationInterface {
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8"`, undefined);
await queryRunner.query(`DROP TABLE "${tablePrefix}credentials_entity"`, undefined);
}
}
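
The SQLite migrations now call `logMigrationStart` / `logMigrationEnd` from `migrationHelpers` instead of ad-hoc `console.log` statements. Their implementation is not shown in this excerpt; a minimal sketch of what such helpers could look like (bodies assumed):

// Hypothetical sketch of the migrationHelpers logging utilities referenced above.
// Only the names come from the imports; the bodies are illustrative.
export function logMigrationStart(migrationName: string): void {
	console.log(`${new Date().toISOString()} - Starting migration ${migrationName}`);
}

export function logMigrationEnd(migrationName: string): void {
	console.log(`${new Date().toISOString()} - Finished migration ${migrationName}`);
}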


@ -1,17 +1,20 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class WebhookModel1592445003908 implements MigrationInterface {
name = 'WebhookModel1592445003908';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE IF NOT EXISTS ${tablePrefix}webhook_entity ("workflowId" integer NOT NULL, "webhookPath" varchar NOT NULL, "method" varchar NOT NULL, "node" varchar NOT NULL, PRIMARY KEY ("webhookPath", "method"))`,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {


@ -1,14 +1,20 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class CreateIndexStoppedAt1594825041918 implements MigrationInterface {
name = 'CreateIndexStoppedAt1594825041918';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}cefb067df2402f6aed0638a6c1" ON "${tablePrefix}execution_entity" ("stoppedAt") `,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
@ -16,5 +22,4 @@ export class CreateIndexStoppedAt1594825041918 implements MigrationInterface {
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}cefb067df2402f6aed0638a6c1"`);
}
}


@ -1,10 +1,13 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class MakeStoppedAtNullable1607431743769 implements MigrationInterface {
name = 'MakeStoppedAtNullable1607431743769';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
// SQLite does not allow us to simply "alter column"
// We're hacking the way sqlite identifies tables
@ -12,12 +15,16 @@ export class MakeStoppedAtNullable1607431743769 implements MigrationInterface {
// This is a very strict case when this can be done safely
// As no collateral effects exist.
await queryRunner.query(`PRAGMA writable_schema = 1; `, undefined);
await queryRunner.query(
`UPDATE SQLITE_MASTER SET SQL = 'CREATE TABLE IF NOT EXISTS "${tablePrefix}execution_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "data" text NOT NULL, "finished" boolean NOT NULL, "mode" varchar NOT NULL, "retryOf" varchar, "retrySuccessId" varchar, "startedAt" datetime NOT NULL, "stoppedAt" datetime, "workflowData" text NOT NULL, "workflowId" varchar)' WHERE NAME = "${tablePrefix}execution_entity";`,
undefined,
);
await queryRunner.query(`PRAGMA writable_schema = 0;`, undefined);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
// This cannot be undone as the table might already have nullable values
}
}
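
The comments above explain the trick this migration relies on: SQLite has no `ALTER COLUMN`, so the table definition stored in `sqlite_master` is rewritten directly under `PRAGMA writable_schema`, which is safe here only because relaxing NOT NULL changes no stored data. The conventional alternative is the copy-and-rename rebuild used by the other SQLite migrations in this set; a hedged sketch (illustrative, not code from this PR):

// Hedged sketch of the standard SQLite "rebuild" approach to altering a column:
// create a replacement table with the desired definition, copy the rows, swap names.
// Column list matches the execution_entity schema used above.
import { QueryRunner } from 'typeorm';

export async function makeStoppedAtNullableByRebuild(
	queryRunner: QueryRunner,
	tablePrefix: string,
): Promise<void> {
	await queryRunner.query(
		`CREATE TABLE "${tablePrefix}temporary_execution_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "data" text NOT NULL, "finished" boolean NOT NULL, "mode" varchar NOT NULL, "retryOf" varchar, "retrySuccessId" varchar, "startedAt" datetime NOT NULL, "stoppedAt" datetime, "workflowData" text NOT NULL, "workflowId" varchar)`,
	);
	await queryRunner.query(
		`INSERT INTO "${tablePrefix}temporary_execution_entity" SELECT * FROM "${tablePrefix}execution_entity"`,
	);
	await queryRunner.query(`DROP TABLE "${tablePrefix}execution_entity"`);
	await queryRunner.query(
		`ALTER TABLE "${tablePrefix}temporary_execution_entity" RENAME TO "${tablePrefix}execution_entity"`,
	);
}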


@ -1,26 +1,45 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class AddWebhookId1611071044839 implements MigrationInterface {
name = 'AddWebhookId1611071044839';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE "temporary_webhook_entity" ("workflowId" integer NOT NULL, "webhookPath" varchar NOT NULL, "method" varchar NOT NULL, "node" varchar NOT NULL, "webhookId" varchar, "pathLength" integer, PRIMARY KEY ("webhookPath", "method"))`,
);
await queryRunner.query(
`INSERT INTO "temporary_webhook_entity"("workflowId", "webhookPath", "method", "node") SELECT "workflowId", "webhookPath", "method", "node" FROM "${tablePrefix}webhook_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}webhook_entity"`);
await queryRunner.query(
`ALTER TABLE "temporary_webhook_entity" RENAME TO "${tablePrefix}webhook_entity"`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}742496f199721a057051acf4c2" ON "${tablePrefix}webhook_entity" ("webhookId", "method", "pathLength") `,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}742496f199721a057051acf4c2"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}webhook_entity" RENAME TO "temporary_webhook_entity"`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}webhook_entity" ("workflowId" integer NOT NULL, "webhookPath" varchar NOT NULL, "method" varchar NOT NULL, "node" varchar NOT NULL, PRIMARY KEY ("webhookPath", "method"))`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}webhook_entity"("workflowId", "webhookPath", "method", "node") SELECT "workflowId", "webhookPath", "method", "node" FROM "temporary_webhook_entity"`,
);
await queryRunner.query(`DROP TABLE "temporary_webhook_entity"`);
}
}


@ -1,38 +1,75 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class CreateTagEntity1617213344594 implements MigrationInterface {
name = 'CreateTagEntity1617213344594';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
// create tags table + relationship with workflow entity
await queryRunner.query(
`CREATE TABLE "${tablePrefix}tag_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(24) NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL)`,
);
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b" ON "${tablePrefix}tag_entity" ("name") `,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}workflows_tags" ("workflowId" integer NOT NULL, "tagId" integer NOT NULL, CONSTRAINT "FK_54b2f0343d6a2078fa137443869" FOREIGN KEY ("workflowId") REFERENCES "${tablePrefix}workflow_entity" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, CONSTRAINT "FK_77505b341625b0b4768082e2171" FOREIGN KEY ("tagId") REFERENCES "${tablePrefix}tag_entity" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, PRIMARY KEY ("workflowId", "tagId"))`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}54b2f0343d6a2078fa13744386" ON "${tablePrefix}workflows_tags" ("workflowId") `,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}77505b341625b0b4768082e217" ON "${tablePrefix}workflows_tags" ("tagId") `,
);
// set default dates for `createdAt` and `updatedAt`
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8"`);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}temporary_credentials_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "data" text NOT NULL, "type" varchar(32) NOT NULL, "nodesAccess" text NOT NULL, "createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')))`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}temporary_credentials_entity"("id", "name", "data", "type", "nodesAccess", "createdAt", "updatedAt") SELECT "id", "name", "data", "type", "nodesAccess", "createdAt", "updatedAt" FROM "${tablePrefix}credentials_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}credentials_entity"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}temporary_credentials_entity" RENAME TO "${tablePrefix}credentials_entity"`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8" ON "${tablePrefix}credentials_entity" ("type") `,
);
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b"`);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}temporary_tag_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(24) NOT NULL, "createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')))`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}temporary_tag_entity"("id", "name", "createdAt", "updatedAt") SELECT "id", "name", "createdAt", "updatedAt" FROM "${tablePrefix}tag_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}tag_entity"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}temporary_tag_entity" RENAME TO "${tablePrefix}tag_entity"`,
);
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b" ON "${tablePrefix}tag_entity" ("name") `,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}temporary_workflow_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "active" boolean NOT NULL, "nodes" text, "connections" text NOT NULL, "createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "settings" text, "staticData" text)`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}temporary_workflow_entity"("id", "name", "active", "nodes", "connections", "createdAt", "updatedAt", "settings", "staticData") SELECT "id", "name", "active", "nodes", "connections", "createdAt", "updatedAt", "settings", "staticData" FROM "${tablePrefix}workflow_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}workflow_entity"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}temporary_workflow_entity" RENAME TO "${tablePrefix}workflow_entity"`,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
@ -40,22 +77,44 @@ export class CreateTagEntity1617213344594 implements MigrationInterface {
// `createdAt` and `updatedAt`
await queryRunner.query(
`ALTER TABLE "${tablePrefix}workflow_entity" RENAME TO "${tablePrefix}temporary_workflow_entity"`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}workflow_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "active" boolean NOT NULL, "nodes" text NOT NULL, "connections" text NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL, "settings" text, "staticData" text)`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}workflow_entity"("id", "name", "active", "nodes", "connections", "createdAt", "updatedAt", "settings", "staticData") SELECT "id", "name", "active", "nodes", "connections", "createdAt", "updatedAt", "settings", "staticData" FROM "${tablePrefix}temporary_workflow_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}temporary_workflow_entity"`);
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}tag_entity" RENAME TO "${tablePrefix}temporary_tag_entity"`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}tag_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(24) NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL)`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}tag_entity"("id", "name", "createdAt", "updatedAt") SELECT "id", "name", "createdAt", "updatedAt" FROM "${tablePrefix}temporary_tag_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}temporary_tag_entity"`);
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b" ON "${tablePrefix}tag_entity" ("name") `,
);
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}credentials_entity" RENAME TO "temporary_credentials_entity"`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}credentials_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(128) NOT NULL, "data" text NOT NULL, "type" varchar(32) NOT NULL, "nodesAccess" text NOT NULL, "createdAt" datetime NOT NULL, "updatedAt" datetime NOT NULL)`,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}credentials_entity"("id", "name", "data", "type", "nodesAccess", "createdAt", "updatedAt") SELECT "id", "name", "data", "type", "nodesAccess", "createdAt", "updatedAt" FROM "${tablePrefix}temporary_credentials_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}temporary_credentials_entity"`);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}07fde106c0b471d8cc80a64fc8" ON "credentials_entity" ("type") `,
);
// tags
@ -65,5 +124,4 @@ export class CreateTagEntity1617213344594 implements MigrationInterface {
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}8f949d7a3a984759044054e89b"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}tag_entity"`);
}
}


@ -1,47 +1,64 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import config = require('../../../../config');
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class UniqueWorkflowNames1620821879465 implements MigrationInterface {
name = 'UniqueWorkflowNames1620821879465';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
const workflowNames = await queryRunner.query(`
SELECT name
FROM "${tablePrefix}workflow_entity"
`);
for (const { name } of workflowNames) {
const [duplicatesQuery, parameters] = queryRunner.connection.driver.escapeQueryWithParameters(
`
SELECT id, name
FROM "${tablePrefix}workflow_entity"
WHERE name = :name
ORDER BY createdAt ASC
`,
{ name },
{},
);
const duplicates = await queryRunner.query(duplicatesQuery, parameters);
if (duplicates.length > 1) {
await Promise.all(
duplicates.map(({ id, name }: { id: number; name: string }, index: number) => {
if (index === 0) return Promise.resolve();
const [updateQuery, updateParams] =
queryRunner.connection.driver.escapeQueryWithParameters(
`
UPDATE "${tablePrefix}workflow_entity"
SET name = :name
WHERE id = '${id}'
`,
{ name: `${name} ${index + 1}` },
{},
);
return queryRunner.query(updateQuery, updateParams);
}),
);
}
}
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefix}943d8f922be094eb507cb9a7f9" ON "${tablePrefix}workflow_entity" ("name") `,
);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(`DROP INDEX "IDX_${tablePrefix}943d8f922be094eb507cb9a7f9"`);
}
}


@ -1,33 +1,55 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class AddWaitColumn1621707690587 implements MigrationInterface {
name = 'AddWaitColumn1621707690587';
async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(`DROP TABLE IF EXISTS "${tablePrefix}temporary_execution_entity"`);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}temporary_execution_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "data" text NOT NULL, "finished" boolean NOT NULL, "mode" varchar NOT NULL, "retryOf" varchar, "retrySuccessId" varchar, "startedAt" datetime NOT NULL, "stoppedAt" datetime, "workflowData" text NOT NULL, "workflowId" varchar, "waitTill" DATETIME)`,
undefined,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}temporary_execution_entity"("id", "data", "finished", "mode", "retryOf", "retrySuccessId", "startedAt", "stoppedAt", "workflowData", "workflowId") SELECT "id", "data", "finished", "mode", "retryOf", "retrySuccessId", "startedAt", "stoppedAt", "workflowData", "workflowId" FROM "${tablePrefix}execution_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}execution_entity"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}temporary_execution_entity" RENAME TO "${tablePrefix}execution_entity"`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}cefb067df2402f6aed0638a6c1" ON "${tablePrefix}execution_entity" ("stoppedAt")`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}ca4a71b47f28ac6ea88293a8e2" ON "${tablePrefix}execution_entity" ("waitTill")`,
);
await queryRunner.query(`VACUUM;`);
logMigrationEnd(this.name);
}
async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE IF NOT EXISTS "${tablePrefix}temporary_execution_entity" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "data" text NOT NULL, "finished" boolean NOT NULL, "mode" varchar NOT NULL, "retryOf" varchar, "retrySuccessId" varchar, "startedAt" datetime NOT NULL, "stoppedAt" datetime, "workflowData" text NOT NULL, "workflowId" varchar)`,
undefined,
);
await queryRunner.query(
`INSERT INTO "${tablePrefix}temporary_execution_entity"("id", "data", "finished", "mode", "retryOf", "retrySuccessId", "startedAt", "stoppedAt", "workflowData", "workflowId") SELECT "id", "data", "finished", "mode", "retryOf", "retrySuccessId", "startedAt", "stoppedAt", "workflowData", "workflowId" FROM "${tablePrefix}execution_entity"`,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}execution_entity"`);
await queryRunner.query(
`ALTER TABLE "${tablePrefix}temporary_execution_entity" RENAME TO "${tablePrefix}execution_entity"`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}cefb067df2402f6aed0638a6c1" ON "${tablePrefix}execution_entity" ("stoppedAt")`,
);
await queryRunner.query(`VACUUM;`);
}
}


@ -1,6 +1,7 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import config = require('../../../../config');
import { MigrationHelpers } from '../../MigrationHelpers';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
// replacing the credentials in workflows and execution
// `nodeType: name` changes to `nodeType: { id, name }`
@ -9,8 +10,8 @@ export class UpdateWorkflowCredentials1630330987096 implements MigrationInterfac
name = 'UpdateWorkflowCredentials1630330987096';
public async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
const helpers = new MigrationHelpers(queryRunner);
@ -146,7 +147,8 @@ export class UpdateWorkflowCredentials1630330987096 implements MigrationInterfac
queryRunner.query(updateQuery, updateParams);
}
});
logMigrationEnd(this.name);
}
public async down(queryRunner: QueryRunner): Promise<void> {


@ -1,31 +1,49 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import * as config from '../../../../config';
import { logMigrationEnd, logMigrationStart } from '../../utils/migrationHelpers';
export class AddExecutionEntityIndexes1644421939510 implements MigrationInterface {
name = 'AddExecutionEntityIndexes1644421939510';
public async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(`DROP INDEX IF EXISTS 'IDX_${tablePrefix}c4d999a5e90784e8caccf5589d'`);
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}ca4a71b47f28ac6ea88293a8e2'`);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}06da892aaf92a48e7d3e400003' ON '${tablePrefix}execution_entity' ('workflowId', 'waitTill', 'id') `,
);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}78d62b89dc1433192b86dce18a' ON '${tablePrefix}execution_entity' ('workflowId', 'finished', 'id') `,
);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}1688846335d274033e15c846a4' ON '${tablePrefix}execution_entity' ('finished', 'id') `,
);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}b94b45ce2c73ce46c54f20b5f9' ON '${tablePrefix}execution_entity' ('waitTill', 'id') `,
);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}81fc04c8a17de15835713505e4' ON '${tablePrefix}execution_entity' ('workflowId', 'id') `,
);
logMigrationEnd(this.name);
}
public async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}81fc04c8a17de15835713505e4'`);
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}b94b45ce2c73ce46c54f20b5f9'`);
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}1688846335d274033e15c846a4'`);
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}78d62b89dc1433192b86dce18a'`);
await queryRunner.query(`DROP INDEX 'IDX_${tablePrefix}06da892aaf92a48e7d3e400003'`);
await queryRunner.query(
`CREATE INDEX 'IDX_${tablePrefix}ca4a71b47f28ac6ea88293a8e2' ON '${tablePrefix}execution_entity' ('waitTill') `,
);
await queryRunner.query(
`CREATE INDEX IF NOT EXISTS 'IDX_${tablePrefix}c4d999a5e90784e8caccf5589d' ON '${tablePrefix}execution_entity' ('workflowId') `,
);
}
}


@ -0,0 +1,118 @@
import { MigrationInterface, QueryRunner } from 'typeorm';
import { v4 as uuid } from 'uuid';
import config = require('../../../../config');
import {
loadSurveyFromDisk,
logMigrationEnd,
logMigrationStart,
} from '../../utils/migrationHelpers';
export class CreateUserManagement1646992772331 implements MigrationInterface {
name = 'CreateUserManagement1646992772331';
public async up(queryRunner: QueryRunner): Promise<void> {
logMigrationStart(this.name);
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE TABLE "${tablePrefix}role" ("id" integer PRIMARY KEY AUTOINCREMENT NOT NULL, "name" varchar(32) NOT NULL, "scope" varchar NOT NULL, "createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), CONSTRAINT "UQ_${tablePrefix}5b49d0f504f7ef31045a1fb2eb8" UNIQUE ("scope", "name"))`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}user" ("id" varchar PRIMARY KEY NOT NULL, "email" varchar(255), "firstName" varchar(32), "lastName" varchar(32), "password" varchar, "resetPasswordToken" varchar, "resetPasswordTokenExpiration" integer DEFAULT NULL, "personalizationAnswers" text, "createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "globalRoleId" integer NOT NULL, CONSTRAINT "FK_${tablePrefix}f0609be844f9200ff4365b1bb3d" FOREIGN KEY ("globalRoleId") REFERENCES "${tablePrefix}role" ("id") ON DELETE NO ACTION ON UPDATE NO ACTION)`,
);
await queryRunner.query(
`CREATE UNIQUE INDEX "UQ_${tablePrefix}e12875dfb3b1d92d7d7c5377e2" ON "${tablePrefix}user" ("email")`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}shared_workflow" ("createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "roleId" integer NOT NULL, "userId" varchar NOT NULL, "workflowId" integer NOT NULL, CONSTRAINT "FK_${tablePrefix}3540da03964527aa24ae014b780" FOREIGN KEY ("roleId") REFERENCES "${tablePrefix}role" ("id") ON DELETE NO ACTION ON UPDATE NO ACTION, CONSTRAINT "FK_${tablePrefix}82b2fd9ec4e3e24209af8160282" FOREIGN KEY ("userId") REFERENCES "${tablePrefix}user" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, CONSTRAINT "FK_${tablePrefix}b83f8d2530884b66a9c848c8b88" FOREIGN KEY ("workflowId") REFERENCES "${tablePrefix}workflow_entity" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, PRIMARY KEY ("userId", "workflowId"))`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}65a0933c0f19d278881653bf81d35064" ON "${tablePrefix}shared_workflow" ("workflowId")`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}shared_credentials" ("createdAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "updatedAt" datetime(3) NOT NULL DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')), "roleId" integer NOT NULL, "userId" varchar NOT NULL, "credentialsId" integer NOT NULL, CONSTRAINT "FK_${tablePrefix}c68e056637562000b68f480815a" FOREIGN KEY ("roleId") REFERENCES "${tablePrefix}role" ("id") ON DELETE NO ACTION ON UPDATE NO ACTION, CONSTRAINT "FK_${tablePrefix}484f0327e778648dd04f1d70493" FOREIGN KEY ("userId") REFERENCES "${tablePrefix}user" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, CONSTRAINT "FK_${tablePrefix}68661def1d4bcf2451ac8dbd949" FOREIGN KEY ("credentialsId") REFERENCES "${tablePrefix}credentials_entity" ("id") ON DELETE CASCADE ON UPDATE NO ACTION, PRIMARY KEY ("userId", "credentialsId"))`,
);
await queryRunner.query(
`CREATE INDEX "IDX_${tablePrefix}829d16efa0e265cb076d50eca8d21733" ON "${tablePrefix}shared_credentials" ("credentialsId")`,
);
await queryRunner.query(
`CREATE TABLE "${tablePrefix}settings" ("key" TEXT NOT NULL,"value" TEXT NOT NULL DEFAULT \'\',"loadOnStartup" boolean NOT NULL default false,PRIMARY KEY("key"))`,
);
await queryRunner.query(`DROP INDEX IF EXISTS "IDX_${tablePrefix}943d8f922be094eb507cb9a7f9"`);
// Insert initial roles
await queryRunner.query(`
INSERT INTO "${tablePrefix}role" (name, scope)
VALUES ("owner", "global");
`);
const instanceOwnerRole = await queryRunner.query('SELECT last_insert_rowid() as insertId');
await queryRunner.query(`
INSERT INTO "${tablePrefix}role" (name, scope)
VALUES ("member", "global");
`);
await queryRunner.query(`
INSERT INTO "${tablePrefix}role" (name, scope)
VALUES ("owner", "workflow");
`);
const workflowOwnerRole = await queryRunner.query('SELECT last_insert_rowid() as insertId');
await queryRunner.query(`
INSERT INTO "${tablePrefix}role" (name, scope)
VALUES ("owner", "credential");
`);
const credentialOwnerRole = await queryRunner.query('SELECT last_insert_rowid() as insertId');
const survey = loadSurveyFromDisk();
const ownerUserId = uuid();
await queryRunner.query(
`
INSERT INTO "${tablePrefix}user" (id, globalRoleId, personalizationAnswers) values
(?, ?, ?)
`,
[ownerUserId, instanceOwnerRole[0].insertId, survey],
);
await queryRunner.query(`
INSERT INTO "${tablePrefix}shared_workflow" (createdAt, updatedAt, roleId, userId, workflowId)
select DATETIME('now'), DATETIME('now'), '${workflowOwnerRole[0].insertId}', '${ownerUserId}', id from "${tablePrefix}workflow_entity"
`);
await queryRunner.query(`
INSERT INTO "${tablePrefix}shared_credentials" (createdAt, updatedAt, roleId, userId, credentialsId)
select DATETIME('now'), DATETIME('now'), '${credentialOwnerRole[0].insertId}', '${ownerUserId}', id from "${tablePrefix}credentials_entity"
`);
await queryRunner.query(`
INSERT INTO "${tablePrefix}settings" (key, value, loadOnStartup) values
('userManagement.isInstanceOwnerSetUp', 'false', true), ('userManagement.skipInstanceOwnerSetup', 'false', true)
`);
logMigrationEnd(this.name);
}
public async down(queryRunner: QueryRunner): Promise<void> {
const tablePrefix = config.get('database.tablePrefix');
await queryRunner.query(
`CREATE UNIQUE INDEX "IDX_${tablePrefix}943d8f922be094eb507cb9a7f9" ON "${tablePrefix}workflow_entity" ("name") `,
);
await queryRunner.query(`DROP TABLE "${tablePrefix}shared_credentials"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}shared_workflow"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}user"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}role"`);
await queryRunner.query(`DROP TABLE "${tablePrefix}settings"`);
}
}


@ -1,3 +1,5 @@
import config = require('../../../../config');
import { InitialMigration1588102412422 } from './1588102412422-InitialMigration';
import { WebhookModel1592445003908 } from './1592445003908-WebhookModel';
import { CreateIndexStoppedAt1594825041918 } from './1594825041918-CreateIndexStoppedAt';
@ -8,8 +10,9 @@ import { UniqueWorkflowNames1620821879465 } from './1620821879465-UniqueWorkflow
import { AddWaitColumn1621707690587 } from './1621707690587-AddWaitColumn';
import { UpdateWorkflowCredentials1630330987096 } from './1630330987096-UpdateWorkflowCredentials';
import { AddExecutionEntityIndexes1644421939510 } from './1644421939510-AddExecutionEntityIndexes';
import { CreateUserManagement1646992772331 } from './1646992772331-CreateUserManagement';
const sqliteMigrations = [
InitialMigration1588102412422,
WebhookModel1592445003908,
CreateIndexStoppedAt1594825041918,
@ -20,4 +23,7 @@ export const sqliteMigrations = [
AddWaitColumn1621707690587,
UpdateWorkflowCredentials1630330987096,
AddExecutionEntityIndexes1644421939510,
CreateUserManagement1646992772331,
];
export { sqliteMigrations };
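As context, an index like sqliteMigrations is normally consumed through TypeORM's connection options. A minimal sketch, assuming a typical TypeORM 0.2.x setup; the import path and option values are illustrative, not this PR's actual bootstrap code:

import { ConnectionOptions, createConnection } from 'typeorm';
import { sqliteMigrations } from './databases/sqlite/migrations'; // path assumed for illustration

const connectionOptions: ConnectionOptions = {
	type: 'sqlite',
	database: 'database.sqlite',
	migrations: sqliteMigrations,
	migrationsRun: false, // run explicitly instead, e.g. connection.runMigrations()
};

// createConnection(connectionOptions).then((connection) => connection.runMigrations());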


@ -0,0 +1,19 @@
/* eslint-disable @typescript-eslint/naming-convention */
import { registerDecorator } from 'class-validator';
export function NoXss() {
return (object: object, propertyName: string): void => {
registerDecorator({
name: 'NoXss',
target: object.constructor,
propertyName,
constraints: [propertyName],
options: { message: `Malicious ${propertyName}` },
validator: {
validate(value: string) {
return !/<(\s*)?(script|a|http)/.test(value);
},
},
});
};
}
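A minimal usage sketch for the decorator above, assuming a hypothetical DTO validated with class-validator (the class and property names are examples only; NoXss is imported from the module above):

import { validate } from 'class-validator';

class ExamplePayload {
	@NoXss()
	firstName: string;

	constructor(firstName: string) {
		this.firstName = firstName;
	}
}

// validate(new ExamplePayload('<script>alert(1)</script>')).then((errors) => {
// 	// errors[0].constraints should contain { NoXss: 'Malicious firstName' }
// });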


@ -0,0 +1,55 @@
import { readFileSync, rmSync } from 'fs';
import { UserSettings } from 'n8n-core';
import { getLogger } from '../../Logger';
const PERSONALIZATION_SURVEY_FILENAME = 'personalizationSurvey.json';
export function loadSurveyFromDisk(): string | null {
const userSettingsPath = UserSettings.getUserN8nFolderPath();
try {
const filename = `${userSettingsPath}/${PERSONALIZATION_SURVEY_FILENAME}`;
const surveyFile = readFileSync(filename, 'utf-8');
rmSync(filename);
const personalizationSurvey = JSON.parse(surveyFile) as object;
const kvPairs = Object.entries(personalizationSurvey);
if (!kvPairs.length) {
throw new Error('personalizationSurvey is empty');
} else {
// eslint-disable-next-line @typescript-eslint/naming-convention
const emptyKeys = kvPairs.reduce((acc, [_key, value]) => {
if (!value || (Array.isArray(value) && !value.length)) {
return acc + 1;
}
return acc;
}, 0);
if (emptyKeys === kvPairs.length) {
throw new Error('incomplete personalizationSurvey');
}
}
return surveyFile;
} catch (error) {
return null;
}
}
let logFinishTimeout: NodeJS.Timeout;
const disableLogging = process.argv[1].split('/').includes('jest');
export function logMigrationStart(migrationName: string): void {
if (disableLogging) return;
const logger = getLogger();
if (!logFinishTimeout) {
logger.warn('Migrations in progress, please do NOT stop the process.');
}
logger.debug(`Starting migration ${migrationName}`);
clearTimeout(logFinishTimeout);
}
export function logMigrationEnd(migrationName: string): void {
if (disableLogging) return;
const logger = getLogger();
logger.debug(`Finished migration ${migrationName}`);
logFinishTimeout = setTimeout(() => {
logger.warn('Migrations finished.');
}, 100);
}


@ -0,0 +1,20 @@
// eslint-disable-next-line import/no-cycle
import { IPersonalizationSurveyAnswers } from '../../Interfaces';
export const idStringifier = {
from: (value: number): string | number => (value ? value.toString() : value),
to: (value: string): number | string => (value ? Number(value) : value),
};
/**
* Ensure a consistent return type for personalization answers in `User`.
* Answers currently stored as `TEXT` on Postgres.
*/
export const answersFormatter = {
to: (answers: IPersonalizationSurveyAnswers): IPersonalizationSurveyAnswers => answers,
from: (answers: IPersonalizationSurveyAnswers | string): IPersonalizationSurveyAnswers => {
return typeof answers === 'string'
? (JSON.parse(answers) as IPersonalizationSurveyAnswers)
: answers;
},
};
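The transformers above are meant to be attached to entity columns. A minimal sketch with hypothetical entities; the column names, types, and import path are assumptions, not the real n8n entity definitions:

import { Column, Entity, PrimaryColumn } from 'typeorm';
import { answersFormatter, idStringifier } from './transformers'; // path assumed for illustration

@Entity()
class ExampleUser {
	@PrimaryColumn()
	id: string;

	// Stored as TEXT in the database; parsed back into an object on read.
	@Column({ type: 'text', nullable: true, transformer: answersFormatter })
	personalizationAnswers: object | null;
}

@Entity()
class ExampleSharedWorkflow {
	// Integer column in the database, exposed as a string on the JS side.
	@PrimaryColumn({ type: 'integer', transformer: idStringifier })
	workflowId: string;
}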

packages/cli/src/requests.d.ts

@ -0,0 +1,272 @@
/* eslint-disable import/no-cycle */
import express = require('express');
import {
IConnections,
ICredentialDataDecryptedObject,
ICredentialNodeAccess,
INode,
INodeCredentialTestRequest,
IRunData,
IWorkflowSettings,
} from 'n8n-workflow';
import { User } from './databases/entities/User';
import type { IExecutionDeleteFilter, IWorkflowDb } from '.';
import type { PublicUser } from './UserManagement/Interfaces';
export type AuthlessRequest<
RouteParams = {},
ResponseBody = {},
RequestBody = {},
RequestQuery = {},
> = express.Request<RouteParams, ResponseBody, RequestBody, RequestQuery>;
export type AuthenticatedRequest<
RouteParams = {},
ResponseBody = {},
RequestBody = {},
RequestQuery = {},
> = express.Request<RouteParams, ResponseBody, RequestBody, RequestQuery> & { user: User };
// ----------------------------------
// /workflows
// ----------------------------------
export declare namespace WorkflowRequest {
type RequestBody = Partial<{
id: string; // delete if sent
name: string;
nodes: INode[];
connections: IConnections;
settings: IWorkflowSettings;
active: boolean;
tags: string[];
}>;
type Create = AuthenticatedRequest<{}, {}, RequestBody>;
type Get = AuthenticatedRequest<{ id: string }>;
type Delete = Get;
type Update = AuthenticatedRequest<{ id: string }, {}, RequestBody>;
type NewName = express.Request<{}, {}, {}, { name?: string }>;
type GetAll = AuthenticatedRequest<{}, {}, {}, { filter: string }>;
type GetAllActive = AuthenticatedRequest;
type GetAllActivationErrors = Get;
type ManualRun = AuthenticatedRequest<
{},
{},
{
workflowData: IWorkflowDb;
runData: IRunData;
startNodes?: string[];
destinationNode?: string;
}
>;
}
// ----------------------------------
// /credentials
// ----------------------------------
export declare namespace CredentialRequest {
type RequestBody = Partial<{
id: string; // delete if sent
name: string;
type: string;
nodesAccess: ICredentialNodeAccess[];
data: ICredentialDataDecryptedObject;
}>;
type Create = AuthenticatedRequest<{}, {}, RequestBody>;
type Get = AuthenticatedRequest<{ id: string }, {}, {}, Record<string, string>>;
type Delete = Get;
type GetAll = AuthenticatedRequest<{}, {}, {}, { filter: string }>;
type Update = AuthenticatedRequest<{ id: string }, {}, RequestBody>;
type NewName = WorkflowRequest.NewName;
type Test = AuthenticatedRequest<{}, {}, INodeCredentialTestRequest>;
}
// ----------------------------------
// /executions
// ----------------------------------
export declare namespace ExecutionRequest {
namespace QueryParam {
type GetAll = {
filter: string; // '{ waitTill: string; finished: boolean, [other: string]: string }'
limit: string;
lastId: string;
firstId: string;
};
type GetAllCurrent = {
filter: string; // '{ workflowId: string }'
};
}
type GetAll = AuthenticatedRequest<{}, {}, {}, QueryParam.GetAll>;
type Get = AuthenticatedRequest<{ id: string }, {}, {}, { unflattedResponse: 'true' | 'false' }>;
type Delete = AuthenticatedRequest<{}, {}, IExecutionDeleteFilter>;
type Retry = AuthenticatedRequest<{ id: string }, {}, { loadWorkflow: boolean }, {}>;
type Stop = AuthenticatedRequest<{ id: string }>;
type GetAllCurrent = AuthenticatedRequest<{}, {}, {}, QueryParam.GetAllCurrent>;
}
// ----------------------------------
// /me
// ----------------------------------
export declare namespace MeRequest {
export type Settings = AuthenticatedRequest<
{},
{},
Pick<PublicUser, 'email' | 'firstName' | 'lastName'>
>;
export type Password = AuthenticatedRequest<
{},
{},
{ currentPassword: string; newPassword: string }
>;
export type SurveyAnswers = AuthenticatedRequest<{}, {}, Record<string, string> | {}>;
}
// ----------------------------------
// /owner
// ----------------------------------
export declare namespace OwnerRequest {
type Post = AuthenticatedRequest<
{},
{},
Partial<{
email: string;
password: string;
firstName: string;
lastName: string;
}>,
{}
>;
}
// ----------------------------------
// password reset endpoints
// ----------------------------------
export declare namespace PasswordResetRequest {
export type Email = AuthlessRequest<{}, {}, Pick<PublicUser, 'email'>>;
export type Credentials = AuthlessRequest<{}, {}, {}, { userId?: string; token?: string }>;
export type NewPassword = AuthlessRequest<
{},
{},
Pick<PublicUser, 'password'> & { token?: string; userId?: string }
>;
}
// ----------------------------------
// /users
// ----------------------------------
export declare namespace UserRequest {
export type Invite = AuthenticatedRequest<{}, {}, Array<{ email: string }>>;
export type ResolveSignUp = AuthlessRequest<
{},
{},
{},
{ inviterId?: string; inviteeId?: string }
>;
export type SignUp = AuthenticatedRequest<
{ id: string },
{ inviterId?: string; inviteeId?: string }
>;
export type Delete = AuthenticatedRequest<{ id: string }, {}, {}, { transferId?: string }>;
export type Reinvite = AuthenticatedRequest<{ id: string }>;
export type Update = AuthlessRequest<
{ id: string },
{},
{
inviterId: string;
firstName: string;
lastName: string;
password: string;
}
>;
}
// ----------------------------------
// /login
// ----------------------------------
export type LoginRequest = AuthlessRequest<
{},
{},
{
email: string;
password: string;
}
>;
// ----------------------------------
// oauth endpoints
// ----------------------------------
export declare namespace OAuthRequest {
namespace OAuth1Credential {
type Auth = AuthenticatedRequest<{}, {}, {}, { id: string }>;
type Callback = AuthenticatedRequest<
{},
{},
{},
{ oauth_verifier: string; oauth_token: string; cid: string }
>;
}
namespace OAuth2Credential {
type Auth = OAuth1Credential.Auth;
type Callback = AuthenticatedRequest<{}, {}, {}, { code: string; state: string }>;
}
}
// ----------------------------------
// /node-parameter-options
// ----------------------------------
export type NodeParameterOptionsRequest = AuthenticatedRequest<
{},
{},
{},
{
nodeTypeAndVersion: string;
methodName: string;
path: string;
currentNodeParameters: string;
credentials: string;
}
>;
// ----------------------------------
// /tags
// ----------------------------------
export declare namespace TagsRequest {
type Delete = AuthenticatedRequest<{ id: string }>;
}
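A minimal sketch of how these request types are intended to be consumed in route handlers; the router setup, path, and response shape are assumptions for illustration, not this PR's controllers:

import express = require('express');
import type { UserRequest } from './requests'; // path assumed for illustration

const exampleRouter = express.Router();

// Auth middleware is assumed to have populated req.user, as AuthenticatedRequest requires.
exampleRouter.post('/users', (req: UserRequest.Invite, res: express.Response) => {
	// req.body is typed as Array<{ email: string }>.
	const emails = req.body.map(({ email }) => email);
	res.json({ data: emails });
});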


@ -1,3 +1,4 @@
/* eslint-disable import/no-cycle */
/* eslint-disable @typescript-eslint/no-unsafe-call */
/* eslint-disable @typescript-eslint/no-unsafe-member-access */
import TelemetryClient = require('@rudderstack/rudder-sdk-node');
@ -184,12 +185,17 @@ export class Telemetry {
});
}
async track(
eventName: string,
properties: { [key: string]: unknown; user_id?: string } = {},
): Promise<void> {
return new Promise<void>((resolve) => {
if (this.client) {
const { user_id } = properties;
Object.assign(properties, { instance_id: this.instanceId });
this.client.track(
{
userId: `${this.instanceId}${user_id ? `#${user_id}` : ''}`,
anonymousId: '000000000000',
event: eventName,
properties,
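With the widened track() signature shown above, callers can pass a user_id so the event is keyed to `${instanceId}#${userId}` rather than the bare instance id. A usage sketch; the event and property names, and the `telemetry` and `invitedUser` values, are hypothetical:

// Illustrative call site only.
void telemetry.track('user invited teammate', {
	user_id: invitedUser.id,
	email_sent: true,
});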


@ -0,0 +1,147 @@
import { hashSync, genSaltSync } from 'bcryptjs';
import express = require('express');
import validator from 'validator';
import { v4 as uuid } from 'uuid';
import config = require('../../config');
import * as utils from './shared/utils';
import { LOGGED_OUT_RESPONSE_BODY } from './shared/constants';
import { Db } from '../../src';
import { Role } from '../../src/databases/entities/Role';
import { randomEmail, randomValidPassword, randomName } from './shared/random';
import { getGlobalOwnerRole } from './shared/testDb';
import * as testDb from './shared/testDb';
let globalOwnerRole: Role;
let app: express.Application;
let testDbName = '';
beforeAll(async () => {
app = utils.initTestServer({ endpointGroups: ['auth'], applyAuth: true });
const initResult = await testDb.init();
testDbName = initResult.testDbName;
await testDb.truncate(['User'], testDbName);
globalOwnerRole = await getGlobalOwnerRole();
utils.initTestLogger();
utils.initTestTelemetry();
});
beforeEach(async () => {
await testDb.createUser({
id: uuid(),
email: TEST_USER.email,
firstName: TEST_USER.firstName,
lastName: TEST_USER.lastName,
password: hashSync(TEST_USER.password, genSaltSync(10)),
globalRole: globalOwnerRole,
});
config.set('userManagement.isInstanceOwnerSetUp', true);
await Db.collections.Settings!.update(
{ key: 'userManagement.isInstanceOwnerSetUp' },
{ value: JSON.stringify(true) },
);
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
test('POST /login should log user in', async () => {
const authlessAgent = utils.createAgent(app);
const response = await authlessAgent.post('/login').send({
email: TEST_USER.email,
password: TEST_USER.password,
});
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
password,
personalizationAnswers,
globalRole,
resetPasswordToken,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(TEST_USER.email);
expect(firstName).toBe(TEST_USER.firstName);
expect(lastName).toBe(TEST_USER.lastName);
expect(password).toBeUndefined();
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(globalRole).toBeDefined();
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
const authToken = utils.getAuthToken(response);
expect(authToken).toBeDefined();
});
test('GET /login should receive logged in user', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.get('/login');
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
password,
personalizationAnswers,
globalRole,
resetPasswordToken,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(TEST_USER.email);
expect(firstName).toBe(TEST_USER.firstName);
expect(lastName).toBe(TEST_USER.lastName);
expect(password).toBeUndefined();
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(globalRole).toBeDefined();
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
expect(response.headers['set-cookie']).toBeUndefined();
});
test('POST /logout should log user out', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/logout');
expect(response.statusCode).toBe(200);
expect(response.body).toEqual(LOGGED_OUT_RESPONSE_BODY);
const authToken = utils.getAuthToken(response);
expect(authToken).toBeUndefined();
});
const TEST_USER = {
email: randomEmail(),
password: randomValidPassword(),
firstName: randomName(),
lastName: randomName(),
};


@ -0,0 +1,59 @@
import express = require('express');
import * as request from 'supertest';
import {
REST_PATH_SEGMENT,
ROUTES_REQUIRING_AUTHORIZATION,
ROUTES_REQUIRING_AUTHENTICATION,
} from './shared/constants';
import * as utils from './shared/utils';
import * as testDb from './shared/testDb';
let app: express.Application;
let testDbName = '';
beforeAll(async () => {
app = utils.initTestServer({
applyAuth: true,
endpointGroups: ['me', 'auth', 'owner', 'users'],
});
const initResult = await testDb.init();
testDbName = initResult.testDbName;
utils.initTestLogger();
utils.initTestTelemetry();
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
ROUTES_REQUIRING_AUTHENTICATION.concat(ROUTES_REQUIRING_AUTHORIZATION).forEach((route) => {
const [method, endpoint] = getMethodAndEndpoint(route);
test(`${route} should return 401 Unauthorized if no cookie`, async () => {
const response = await request(app)[method](endpoint).use(utils.prefix(REST_PATH_SEGMENT));
expect(response.statusCode).toBe(401);
});
});
ROUTES_REQUIRING_AUTHORIZATION.forEach(async (route) => {
const [method, endpoint] = getMethodAndEndpoint(route);
test(`${route} should return 403 Forbidden for member`, async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const response = await authMemberAgent[method](endpoint);
if (response.statusCode === 500) {
console.log(response);
}
expect(response.statusCode).toBe(403);
});
});
function getMethodAndEndpoint(route: string) {
return route.split(' ').map((segment, index) => {
return index % 2 === 0 ? segment.toLowerCase() : segment;
});
}


@ -0,0 +1,551 @@
import express = require('express');
import { UserSettings } from 'n8n-core';
import { Db } from '../../src';
import { randomName, randomString } from './shared/random';
import * as utils from './shared/utils';
import type { CredentialPayload, SaveCredentialFunction } from './shared/types';
import { Role } from '../../src/databases/entities/Role';
import { User } from '../../src/databases/entities/User';
import * as testDb from './shared/testDb';
let app: express.Application;
let testDbName = '';
let saveCredential: SaveCredentialFunction;
beforeAll(async () => {
app = utils.initTestServer({
endpointGroups: ['credentials'],
applyAuth: true,
});
const initResult = await testDb.init();
testDbName = initResult.testDbName;
utils.initConfigFile();
const credentialOwnerRole = await testDb.getCredentialOwnerRole();
saveCredential = affixRoleToSaveCredential(credentialOwnerRole);
utils.initTestTelemetry();
});
beforeEach(async () => {
await testDb.createOwnerShell();
});
afterEach(async () => {
// do not combine calls - shared table must be cleared first and separately
await testDb.truncate(['SharedCredentials'], testDbName);
await testDb.truncate(['User', 'Credentials'], testDbName);
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
test('POST /credentials should create cred', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const payload = credentialPayload();
const response = await authOwnerAgent.post('/credentials').send(payload);
expect(response.statusCode).toBe(200);
const { id, name, type, nodesAccess, data: encryptedData } = response.body.data;
expect(name).toBe(payload.name);
expect(type).toBe(payload.type);
expect(nodesAccess[0].nodeType).toBe(payload.nodesAccess[0].nodeType);
expect(encryptedData).not.toBe(payload.data);
const credential = await Db.collections.Credentials!.findOneOrFail(id);
expect(credential.name).toBe(payload.name);
expect(credential.type).toBe(payload.type);
expect(credential.nodesAccess[0].nodeType).toBe(payload.nodesAccess[0].nodeType);
expect(credential.data).not.toBe(payload.data);
const sharedCredential = await Db.collections.SharedCredentials!.findOneOrFail({
relations: ['user', 'credentials'],
where: { credentials: credential },
});
expect(sharedCredential.user.id).toBe(owner.id);
expect(sharedCredential.credentials.name).toBe(payload.name);
});
test('POST /credentials should fail with invalid inputs', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
for (const invalidPayload of INVALID_PAYLOADS) {
const response = await authOwnerAgent.post('/credentials').send(invalidPayload);
expect(response.statusCode).toBe(400);
}
});
test('POST /credentials should fail with missing encryption key', async () => {
const mock = jest.spyOn(UserSettings, 'getEncryptionKey');
mock.mockResolvedValue(undefined);
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/credentials').send(credentialPayload());
expect(response.statusCode).toBe(500);
mock.mockRestore();
});
test('POST /credentials should ignore ID in payload', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const firstResponse = await authOwnerAgent
.post('/credentials')
.send({ id: '8', ...credentialPayload() });
expect(firstResponse.body.data.id).not.toBe('8');
const secondResponse = await authOwnerAgent
.post('/credentials')
.send({ id: 8, ...credentialPayload() });
expect(secondResponse.body.data.id).not.toBe(8);
});
test('DELETE /credentials/:id should delete owned cred for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const response = await authOwnerAgent.delete(`/credentials/${savedCredential.id}`);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual({ data: true });
const deletedCredential = await Db.collections.Credentials!.findOne(savedCredential.id);
expect(deletedCredential).toBeUndefined(); // deleted
const deletedSharedCredential = await Db.collections.SharedCredentials!.findOne();
expect(deletedSharedCredential).toBeUndefined(); // deleted
});
test('DELETE /credentials/:id should delete non-owned cred for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const member = await testDb.createUser();
const savedCredential = await saveCredential(credentialPayload(), { user: member });
const response = await authOwnerAgent.delete(`/credentials/${savedCredential.id}`);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual({ data: true });
const deletedCredential = await Db.collections.Credentials!.findOne(savedCredential.id);
expect(deletedCredential).toBeUndefined(); // deleted
const deletedSharedCredential = await Db.collections.SharedCredentials!.findOne();
expect(deletedSharedCredential).toBeUndefined(); // deleted
});
test('DELETE /credentials/:id should delete owned cred for member', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: member });
const response = await authMemberAgent.delete(`/credentials/${savedCredential.id}`);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual({ data: true });
const deletedCredential = await Db.collections.Credentials!.findOne(savedCredential.id);
expect(deletedCredential).toBeUndefined(); // deleted
const deletedSharedCredential = await Db.collections.SharedCredentials!.findOne();
expect(deletedSharedCredential).toBeUndefined(); // deleted
});
test('DELETE /credentials/:id should not delete non-owned cred for member', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const response = await authMemberAgent.delete(`/credentials/${savedCredential.id}`);
expect(response.statusCode).toBe(404);
const shellCredential = await Db.collections.Credentials!.findOne(savedCredential.id);
expect(shellCredential).toBeDefined(); // not deleted
const deletedSharedCredential = await Db.collections.SharedCredentials!.findOne();
expect(deletedSharedCredential).toBeDefined(); // not deleted
});
test('DELETE /credentials/:id should fail if cred not found', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.delete('/credentials/123');
expect(response.statusCode).toBe(404);
});
test('PATCH /credentials/:id should update owned cred for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const patchPayload = credentialPayload();
const response = await authOwnerAgent
.patch(`/credentials/${savedCredential.id}`)
.send(patchPayload);
expect(response.statusCode).toBe(200);
const { id, name, type, nodesAccess, data: encryptedData } = response.body.data;
expect(name).toBe(patchPayload.name);
expect(type).toBe(patchPayload.type);
expect(nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(encryptedData).not.toBe(patchPayload.data);
const credential = await Db.collections.Credentials!.findOneOrFail(id);
expect(credential.name).toBe(patchPayload.name);
expect(credential.type).toBe(patchPayload.type);
expect(credential.nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(credential.data).not.toBe(patchPayload.data);
const sharedCredential = await Db.collections.SharedCredentials!.findOneOrFail({
relations: ['credentials'],
where: { credentials: credential },
});
expect(sharedCredential.credentials.name).toBe(patchPayload.name); // updated
});
test('PATCH /credentials/:id should update non-owned cred for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const member = await testDb.createUser();
const savedCredential = await saveCredential(credentialPayload(), { user: member });
const patchPayload = credentialPayload();
const response = await authOwnerAgent
.patch(`/credentials/${savedCredential.id}`)
.send(patchPayload);
expect(response.statusCode).toBe(200);
const { id, name, type, nodesAccess, data: encryptedData } = response.body.data;
expect(name).toBe(patchPayload.name);
expect(type).toBe(patchPayload.type);
expect(nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(encryptedData).not.toBe(patchPayload.data);
const credential = await Db.collections.Credentials!.findOneOrFail(id);
expect(credential.name).toBe(patchPayload.name);
expect(credential.type).toBe(patchPayload.type);
expect(credential.nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(credential.data).not.toBe(patchPayload.data);
const sharedCredential = await Db.collections.SharedCredentials!.findOneOrFail({
relations: ['credentials'],
where: { credentials: credential },
});
expect(sharedCredential.credentials.name).toBe(patchPayload.name); // updated
});
test('PATCH /credentials/:id should update owned cred for member', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: member });
const patchPayload = credentialPayload();
const response = await authMemberAgent
.patch(`/credentials/${savedCredential.id}`)
.send(patchPayload);
expect(response.statusCode).toBe(200);
const { id, name, type, nodesAccess, data: encryptedData } = response.body.data;
expect(name).toBe(patchPayload.name);
expect(type).toBe(patchPayload.type);
expect(nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(encryptedData).not.toBe(patchPayload.data);
const credential = await Db.collections.Credentials!.findOneOrFail(id);
expect(credential.name).toBe(patchPayload.name);
expect(credential.type).toBe(patchPayload.type);
expect(credential.nodesAccess[0].nodeType).toBe(patchPayload.nodesAccess[0].nodeType);
expect(credential.data).not.toBe(patchPayload.data);
const sharedCredential = await Db.collections.SharedCredentials!.findOneOrFail({
relations: ['credentials'],
where: { credentials: credential },
});
expect(sharedCredential.credentials.name).toBe(patchPayload.name); // updated
});
test('PATCH /credentials/:id should not update non-owned cred for member', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const patchPayload = credentialPayload();
const response = await authMemberAgent
.patch(`/credentials/${savedCredential.id}`)
.send(patchPayload);
expect(response.statusCode).toBe(404);
const shellCredential = await Db.collections.Credentials!.findOneOrFail(savedCredential.id);
expect(shellCredential.name).not.toBe(patchPayload.name); // not updated
});
test('PATCH /credentials/:id should fail with invalid inputs', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
for (const invalidPayload of INVALID_PAYLOADS) {
const response = await authOwnerAgent
.patch(`/credentials/${savedCredential.id}`)
.send(invalidPayload);
expect(response.statusCode).toBe(400);
}
});
test('PATCH /credentials/:id should fail if cred not found', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.patch('/credentials/123').send(credentialPayload());
expect(response.statusCode).toBe(404);
});
test('PATCH /credentials/:id should fail with missing encryption key', async () => {
const mock = jest.spyOn(UserSettings, 'getEncryptionKey');
mock.mockResolvedValue(undefined);
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/credentials').send(credentialPayload());
expect(response.statusCode).toBe(500);
mock.mockRestore();
});
test('GET /credentials should retrieve all creds for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
for (let i = 0; i < 3; i++) {
await saveCredential(credentialPayload(), { user: owner });
}
const member = await testDb.createUser();
await saveCredential(credentialPayload(), { user: member });
const response = await authOwnerAgent.get('/credentials');
expect(response.statusCode).toBe(200);
expect(response.body.data.length).toBe(4); // 3 owner + 1 member
for (const credential of response.body.data) {
const { name, type, nodesAccess, data: encryptedData } = credential;
expect(typeof name).toBe('string');
expect(typeof type).toBe('string');
expect(typeof nodesAccess[0].nodeType).toBe('string');
expect(encryptedData).toBeUndefined();
}
});
test('GET /credentials should retrieve owned creds for member', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
for (let i = 0; i < 3; i++) {
await saveCredential(credentialPayload(), { user: member });
}
const response = await authMemberAgent.get('/credentials');
expect(response.statusCode).toBe(200);
expect(response.body.data.length).toBe(3);
for (const credential of response.body.data) {
const { name, type, nodesAccess, data: encryptedData } = credential;
expect(typeof name).toBe('string');
expect(typeof type).toBe('string');
expect(typeof nodesAccess[0].nodeType).toBe('string');
expect(encryptedData).toBeUndefined();
}
});
test('GET /credentials should not retrieve non-owned creds for member', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
for (let i = 0; i < 3; i++) {
await saveCredential(credentialPayload(), { user: owner });
}
const response = await authMemberAgent.get('/credentials');
expect(response.statusCode).toBe(200);
expect(response.body.data.length).toBe(0); // owner's creds not returned
});
test('GET /credentials/:id should retrieve owned cred for owner', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const firstResponse = await authOwnerAgent.get(`/credentials/${savedCredential.id}`);
expect(firstResponse.statusCode).toBe(200);
expect(typeof firstResponse.body.data.name).toBe('string');
expect(typeof firstResponse.body.data.type).toBe('string');
expect(typeof firstResponse.body.data.nodesAccess[0].nodeType).toBe('string');
expect(firstResponse.body.data.data).toBeUndefined();
const secondResponse = await authOwnerAgent
.get(`/credentials/${savedCredential.id}`)
.query({ includeData: true });
expect(secondResponse.statusCode).toBe(200);
expect(typeof secondResponse.body.data.name).toBe('string');
expect(typeof secondResponse.body.data.type).toBe('string');
expect(typeof secondResponse.body.data.nodesAccess[0].nodeType).toBe('string');
expect(secondResponse.body.data.data).toBeDefined();
});
test('GET /credentials/:id should retrieve owned cred for member', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: member });
const firstResponse = await authMemberAgent.get(`/credentials/${savedCredential.id}`);
expect(firstResponse.statusCode).toBe(200);
expect(typeof firstResponse.body.data.name).toBe('string');
expect(typeof firstResponse.body.data.type).toBe('string');
expect(typeof firstResponse.body.data.nodesAccess[0].nodeType).toBe('string');
expect(firstResponse.body.data.data).toBeUndefined();
const secondResponse = await authMemberAgent
.get(`/credentials/${savedCredential.id}`)
.query({ includeData: true });
expect(secondResponse.statusCode).toBe(200);
expect(typeof secondResponse.body.data.name).toBe('string');
expect(typeof secondResponse.body.data.type).toBe('string');
expect(typeof secondResponse.body.data.nodesAccess[0].nodeType).toBe('string');
expect(secondResponse.body.data.data).toBeDefined();
});
test('GET /credentials/:id should not retrieve non-owned cred for member', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const response = await authMemberAgent.get(`/credentials/${savedCredential.id}`);
expect(response.statusCode).toBe(404);
expect(response.body.data).toBeUndefined(); // owner's cred not returned
});
test('GET /credentials/:id should fail with missing encryption key', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const savedCredential = await saveCredential(credentialPayload(), { user: owner });
const mock = jest.spyOn(UserSettings, 'getEncryptionKey');
mock.mockResolvedValue(undefined);
const response = await authOwnerAgent
.get(`/credentials/${savedCredential.id}`)
.query({ includeData: true });
expect(response.statusCode).toBe(500);
mock.mockRestore();
});
test('GET /credentials/:id should return 404 if cred not found', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authMemberAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authMemberAgent.get('/credentials/789');
expect(response.statusCode).toBe(404);
});
const credentialPayload = () => ({
name: randomName(),
type: randomName(),
nodesAccess: [{ nodeType: randomName() }],
data: { accessToken: randomString(6, 16) },
});
const INVALID_PAYLOADS = [
{
type: randomName(),
nodesAccess: [{ nodeType: randomName() }],
data: { accessToken: randomString(6, 16) },
},
{
name: randomName(),
nodesAccess: [{ nodeType: randomName() }],
data: { accessToken: randomString(6, 16) },
},
{
name: randomName(),
type: randomName(),
data: { accessToken: randomString(6, 16) },
},
{
name: randomName(),
type: randomName(),
nodesAccess: [{ nodeType: randomName() }],
},
{},
[],
undefined,
];
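// The payloads above each omit exactly one required credential property (or are
// not objects at all); the POST /credentials tests that consume them sit above
// this excerpt. A minimal sketch of the expected pattern follows - the helper
// name and the 400 expectation are assumptions here, not quoted from those tests.
async function expectInvalidPayloadsToBeRejected(agent: ReturnType<typeof utils.createAgent>) {
  for (const invalidPayload of INVALID_PAYLOADS) {
    const response = await agent.post('/credentials').send(invalidPayload);
    expect(response.statusCode).toBe(400);
  }
}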
function affixRoleToSaveCredential(role: Role) {
return (credentialPayload: CredentialPayload, { user }: { user: User }) =>
testDb.saveCredential(credentialPayload, { user, role });
}
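// A usage sketch for the factory above (the names below are assumptions; the
// suite's actual beforeAll sits above this excerpt): fetch the credential-owner
// role once and partially apply it, so individual tests only pass a payload and
// the owning user.
async function exampleSaveCredentialSetup(owner: User) {
  const credentialOwnerRole = await testDb.getCredentialOwnerRole();
  const saveOwnedCredential = affixRoleToSaveCredential(credentialOwnerRole);
  return saveOwnedCredential(credentialPayload(), { user: owner });
}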

@@ -0,0 +1,529 @@
import { hashSync, genSaltSync } from 'bcryptjs';
import express = require('express');
import validator from 'validator';
import config = require('../../config');
import * as utils from './shared/utils';
import { SUCCESS_RESPONSE_BODY } from './shared/constants';
import { Db } from '../../src';
import { Role } from '../../src/databases/entities/Role';
import { randomValidPassword, randomEmail, randomName, randomString } from './shared/random';
import * as testDb from './shared/testDb';
let app: express.Application;
let testDbName = '';
let globalOwnerRole: Role;
beforeAll(async () => {
app = utils.initTestServer({ endpointGroups: ['me'], applyAuth: true });
const initResult = await testDb.init();
testDbName = initResult.testDbName;
globalOwnerRole = await testDb.getGlobalOwnerRole();
utils.initTestLogger();
utils.initTestTelemetry();
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
describe('Owner shell', () => {
beforeEach(async () => {
await testDb.createOwnerShell();
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
test('GET /me should return sanitized owner shell', async () => {
const ownerShell = await Db.collections.User!.findOneOrFail();
const authOwnerShellAgent = utils.createAgent(app, { auth: true, user: ownerShell });
const response = await authOwnerShellAgent.get('/me');
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBeNull();
expect(firstName).toBeNull();
expect(lastName).toBeNull();
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(true);
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
});
test('PATCH /me should succeed with valid inputs', async () => {
const ownerShell = await Db.collections.User!.findOneOrFail();
const authOwnerShellAgent = utils.createAgent(app, { auth: true, user: ownerShell });
for (const validPayload of VALID_PATCH_ME_PAYLOADS) {
const response = await authOwnerShellAgent.patch('/me').send(validPayload);
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(validPayload.email);
expect(firstName).toBe(validPayload.firstName);
expect(lastName).toBe(validPayload.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
const storedOwnerShell = await Db.collections.User!.findOneOrFail(id);
expect(storedOwnerShell.email).toBe(validPayload.email);
expect(storedOwnerShell.firstName).toBe(validPayload.firstName);
expect(storedOwnerShell.lastName).toBe(validPayload.lastName);
}
});
test('PATCH /me should fail with invalid inputs', async () => {
const ownerShell = await Db.collections.User!.findOneOrFail();
const authOwnerShellAgent = utils.createAgent(app, { auth: true, user: ownerShell });
for (const invalidPayload of INVALID_PATCH_ME_PAYLOADS) {
const response = await authOwnerShellAgent.patch('/me').send(invalidPayload);
expect(response.statusCode).toBe(400);
const storedOwnerShell = await Db.collections.User!.findOneOrFail();
expect(storedOwnerShell.email).toBeNull();
expect(storedOwnerShell.firstName).toBeNull();
expect(storedOwnerShell.lastName).toBeNull();
}
});
test('PATCH /me/password should fail for shell', async () => {
const ownerShell = await Db.collections.User!.findOneOrFail();
const authOwnerShellAgent = utils.createAgent(app, { auth: true, user: ownerShell });
const validPasswordPayload = {
currentPassword: randomValidPassword(),
newPassword: randomValidPassword(),
};
const payloads = [validPasswordPayload, ...INVALID_PASSWORD_PAYLOADS];
for (const payload of payloads) {
const response = await authOwnerShellAgent.patch('/me/password').send(payload);
expect([400, 500].includes(response.statusCode)).toBe(true);
const storedMember = await Db.collections.User!.findOneOrFail();
if (payload.newPassword) {
expect(storedMember.password).not.toBe(payload.newPassword);
}
if (payload.currentPassword) {
expect(storedMember.password).not.toBe(payload.currentPassword);
}
}
const storedOwnerShell = await Db.collections.User!.findOneOrFail();
expect(storedOwnerShell.password).toBeNull();
});
test('POST /me/survey should succeed with valid inputs', async () => {
const ownerShell = await Db.collections.User!.findOneOrFail();
const authOwnerShellAgent = utils.createAgent(app, { auth: true, user: ownerShell });
const validPayloads = [SURVEY, {}];
for (const validPayload of validPayloads) {
const response = await authOwnerShellAgent.post('/me/survey').send(validPayload);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual(SUCCESS_RESPONSE_BODY);
const { personalizationAnswers: storedAnswers } = await Db.collections.User!.findOneOrFail();
expect(storedAnswers).toEqual(validPayload);
}
});
});
describe('Member', () => {
beforeEach(async () => {
config.set('userManagement.isInstanceOwnerSetUp', true);
await Db.collections.Settings!.update(
{ key: 'userManagement.isInstanceOwnerSetUp' },
{ value: JSON.stringify(true) },
);
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
test('GET /me should return sanitized member', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const response = await authMemberAgent.get('/me');
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(member.email);
expect(firstName).toBe(member.firstName);
expect(lastName).toBe(member.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole.name).toBe('member');
expect(globalRole.scope).toBe('global');
});
test('PATCH /me should succeed with valid inputs', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
for (const validPayload of VALID_PATCH_ME_PAYLOADS) {
const response = await authMemberAgent.patch('/me').send(validPayload);
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(validPayload.email);
expect(firstName).toBe(validPayload.firstName);
expect(lastName).toBe(validPayload.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole.name).toBe('member');
expect(globalRole.scope).toBe('global');
const storedMember = await Db.collections.User!.findOneOrFail(id);
expect(storedMember.email).toBe(validPayload.email);
expect(storedMember.firstName).toBe(validPayload.firstName);
expect(storedMember.lastName).toBe(validPayload.lastName);
}
});
test('PATCH /me should fail with invalid inputs', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
for (const invalidPayload of INVALID_PATCH_ME_PAYLOADS) {
const response = await authMemberAgent.patch('/me').send(invalidPayload);
expect(response.statusCode).toBe(400);
const storedMember = await Db.collections.User!.findOneOrFail();
expect(storedMember.email).toBe(member.email);
expect(storedMember.firstName).toBe(member.firstName);
expect(storedMember.lastName).toBe(member.lastName);
}
});
test('PATCH /me/password should succeed with valid inputs', async () => {
const memberPassword = randomValidPassword();
const member = await testDb.createUser({
password: hashSync(memberPassword, genSaltSync(10)),
});
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const validPayload = {
currentPassword: memberPassword,
newPassword: randomValidPassword(),
};
const response = await authMemberAgent.patch('/me/password').send(validPayload);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual(SUCCESS_RESPONSE_BODY);
const storedMember = await Db.collections.User!.findOneOrFail();
expect(storedMember.password).not.toBe(member.password);
expect(storedMember.password).not.toBe(validPayload.newPassword);
});
test('PATCH /me/password should fail with invalid inputs', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
for (const payload of INVALID_PASSWORD_PAYLOADS) {
const response = await authMemberAgent.patch('/me/password').send(payload);
expect([400, 500].includes(response.statusCode)).toBe(true);
const storedMember = await Db.collections.User!.findOneOrFail();
if (payload.newPassword) {
expect(storedMember.password).not.toBe(payload.newPassword);
}
if (payload.currentPassword) {
expect(storedMember.password).not.toBe(payload.currentPassword);
}
}
});
test('POST /me/survey should succeed with valid inputs', async () => {
const member = await testDb.createUser();
const authMemberAgent = utils.createAgent(app, { auth: true, user: member });
const validPayloads = [SURVEY, {}];
for (const validPayload of validPayloads) {
const response = await authMemberAgent.post('/me/survey').send(validPayload);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual(SUCCESS_RESPONSE_BODY);
const { personalizationAnswers: storedAnswers } = await Db.collections.User!.findOneOrFail();
expect(storedAnswers).toEqual(validPayload);
}
});
});
describe('Owner', () => {
beforeEach(async () => {
config.set('userManagement.isInstanceOwnerSetUp', true);
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
test('GET /me should return sanitized owner', async () => {
const owner = await testDb.createUser({ globalRole: globalOwnerRole });
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.get('/me');
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(owner.email);
expect(firstName).toBe(owner.firstName);
expect(lastName).toBe(owner.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
});
test('PATCH /me should succeed with valid inputs', async () => {
const owner = await testDb.createUser({ globalRole: globalOwnerRole });
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
for (const validPayload of VALID_PATCH_ME_PAYLOADS) {
const response = await authOwnerAgent.patch('/me').send(validPayload);
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(validPayload.email);
expect(firstName).toBe(validPayload.firstName);
expect(lastName).toBe(validPayload.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
const storedOwner = await Db.collections.User!.findOneOrFail(id);
expect(storedOwner.email).toBe(validPayload.email);
expect(storedOwner.firstName).toBe(validPayload.firstName);
expect(storedOwner.lastName).toBe(validPayload.lastName);
}
});
});
const TEST_USER = {
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
};
const SURVEY = [
'codingSkill',
'companyIndustry',
'companySize',
'otherCompanyIndustry',
'otherWorkArea',
'workArea',
].reduce<Record<string, string>>((acc, cur) => {
acc[cur] = randomString(2, 10);
return acc;
}, {});
const VALID_PATCH_ME_PAYLOADS = [
{
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
},
{
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
},
];
const INVALID_PATCH_ME_PAYLOADS = [
{
email: 'invalid',
firstName: randomName(),
lastName: randomName(),
},
{
email: randomEmail(),
firstName: '',
lastName: randomName(),
},
{
email: randomEmail(),
firstName: randomName(),
lastName: '',
},
{
email: randomEmail(),
firstName: 123,
lastName: randomName(),
},
{
firstName: randomName(),
lastName: randomName(),
},
{
firstName: randomName(),
},
{
lastName: randomName(),
},
{
email: randomEmail(),
firstName: 'John <script',
lastName: randomName(),
},
{
email: randomEmail(),
firstName: 'John <a',
lastName: randomName(),
},
];
const INVALID_PASSWORD_PAYLOADS = [
{
currentPassword: null,
newPassword: randomValidPassword(),
},
{
currentPassword: '',
newPassword: randomValidPassword(),
},
{
currentPassword: {},
newPassword: randomValidPassword(),
},
{
currentPassword: [],
newPassword: randomValidPassword(),
},
{
currentPassword: randomValidPassword(),
},
{
newPassword: randomValidPassword(),
},
{
currentPassword: randomValidPassword(),
newPassword: null,
},
{
currentPassword: randomValidPassword(),
newPassword: '',
},
{
currentPassword: randomValidPassword(),
newPassword: {},
},
{
currentPassword: randomValidPassword(),
newPassword: [],
},
];

@@ -0,0 +1,162 @@
import express = require('express');
import validator from 'validator';
import * as utils from './shared/utils';
import * as testDb from './shared/testDb';
import { Db } from '../../src';
import config = require('../../config');
import {
randomEmail,
randomName,
randomValidPassword,
randomInvalidPassword,
} from './shared/random';
let app: express.Application;
let testDbName = '';
beforeAll(async () => {
app = utils.initTestServer({ endpointGroups: ['owner'], applyAuth: true });
const initResult = await testDb.init();
testDbName = initResult.testDbName;
utils.initTestLogger();
utils.initTestTelemetry();
});
beforeEach(async () => {
await testDb.createOwnerShell();
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
test('POST /owner should create owner and enable isInstanceOwnerSetUp', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/owner').send(TEST_USER);
expect(response.statusCode).toBe(200);
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBe(TEST_USER.email);
expect(firstName).toBe(TEST_USER.firstName);
expect(lastName).toBe(TEST_USER.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(isPending).toBe(false);
expect(resetPasswordToken).toBeUndefined();
expect(globalRole.name).toBe('owner');
expect(globalRole.scope).toBe('global');
const storedOwner = await Db.collections.User!.findOneOrFail(id);
expect(storedOwner.password).not.toBe(TEST_USER.password);
expect(storedOwner.email).toBe(TEST_USER.email);
expect(storedOwner.firstName).toBe(TEST_USER.firstName);
expect(storedOwner.lastName).toBe(TEST_USER.lastName);
const isInstanceOwnerSetUpConfig = config.get('userManagement.isInstanceOwnerSetUp');
expect(isInstanceOwnerSetUpConfig).toBe(true);
const isInstanceOwnerSetUpSetting = await utils.isInstanceOwnerSetUp();
expect(isInstanceOwnerSetUpSetting).toBe(true);
});
test('POST /owner should fail with invalid inputs', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
for (const invalidPayload of INVALID_POST_OWNER_PAYLOADS) {
const response = await authOwnerAgent.post('/owner').send(invalidPayload);
expect(response.statusCode).toBe(400);
}
});
test('POST /owner/skip-setup should persist skipping setup to the DB', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/owner/skip-setup').send();
expect(response.statusCode).toBe(200);
const skipConfig = config.get('userManagement.skipInstanceOwnerSetup');
expect(skipConfig).toBe(true);
const { value } = await Db.collections.Settings!.findOneOrFail({
key: 'userManagement.skipInstanceOwnerSetup',
});
expect(value).toBe('true');
});
const TEST_USER = {
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
};
const INVALID_POST_OWNER_PAYLOADS = [
{
email: '',
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
},
{
email: randomEmail(),
firstName: '',
lastName: randomName(),
password: randomValidPassword(),
},
{
email: randomEmail(),
firstName: randomName(),
lastName: '',
password: randomValidPassword(),
},
{
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomInvalidPassword(),
},
{
firstName: randomName(),
lastName: randomName(),
},
{
firstName: randomName(),
},
{
lastName: randomName(),
},
{
email: randomEmail(),
firstName: 'John <script',
lastName: randomName(),
},
{
email: randomEmail(),
firstName: 'John <a',
lastName: randomName(),
},
];

@@ -0,0 +1,305 @@
import express = require('express');
import { v4 as uuid } from 'uuid';
import * as utils from './shared/utils';
import { Db } from '../../src';
import config = require('../../config');
import { compare } from 'bcryptjs';
import {
randomEmail,
randomInvalidPassword,
randomName,
randomValidPassword,
} from './shared/random';
import { Role } from '../../src/databases/entities/Role';
import * as testDb from './shared/testDb';
let app: express.Application;
let globalOwnerRole: Role;
let testDbName = '';
beforeAll(async () => {
app = utils.initTestServer({ endpointGroups: ['passwordReset'], applyAuth: true });
const initResult = await testDb.init();
testDbName = initResult.testDbName;
await testDb.truncate(['User'], testDbName);
globalOwnerRole = await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'global',
});
utils.initTestTelemetry();
utils.initTestLogger();
});
beforeEach(async () => {
jest.isolateModules(() => {
jest.mock('../../config');
});
config.set('userManagement.isInstanceOwnerSetUp', true);
config.set('userManagement.emails.mode', '');
await testDb.createUser({
id: INITIAL_TEST_USER.id,
email: INITIAL_TEST_USER.email,
password: INITIAL_TEST_USER.password,
firstName: INITIAL_TEST_USER.firstName,
lastName: INITIAL_TEST_USER.lastName,
globalRole: globalOwnerRole,
});
});
afterEach(async () => {
await testDb.truncate(['User'], testDbName);
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
test('POST /forgot-password should send password reset email', async () => {
const authlessAgent = utils.createAgent(app);
const {
user,
pass,
smtp: { host, port, secure },
} = await utils.getSmtpTestAccount();
config.set('userManagement.emails.mode', 'smtp');
config.set('userManagement.emails.smtp.host', host);
config.set('userManagement.emails.smtp.port', port);
config.set('userManagement.emails.smtp.secure', secure);
config.set('userManagement.emails.smtp.auth.user', user);
config.set('userManagement.emails.smtp.auth.pass', pass);
const response = await authlessAgent
.post('/forgot-password')
.send({ email: INITIAL_TEST_USER.email });
expect(response.statusCode).toBe(200);
expect(response.body).toEqual({});
const owner = await Db.collections.User!.findOneOrFail({ email: INITIAL_TEST_USER.email });
expect(owner.resetPasswordToken).toBeDefined();
expect(owner.resetPasswordTokenExpiration).toBeGreaterThan(Math.ceil(Date.now() / 1000));
});
test('POST /forgot-password should fail if emailing is not set up', async () => {
const authlessAgent = utils.createAgent(app);
const response = await authlessAgent
.post('/forgot-password')
.send({ email: INITIAL_TEST_USER.email });
expect(response.statusCode).toBe(500);
const owner = await Db.collections.User!.findOneOrFail({ email: INITIAL_TEST_USER.email });
expect(owner.resetPasswordToken).toBeNull();
});
test('POST /forgot-password should fail with invalid inputs', async () => {
const authlessAgent = utils.createAgent(app);
config.set('userManagement.emails.mode', 'smtp');
const invalidPayloads = [
randomEmail(),
[randomEmail()],
{},
[{ name: randomName() }],
[{ email: randomName() }],
];
for (const invalidPayload of invalidPayloads) {
const response = await authlessAgent.post('/forgot-password').send(invalidPayload);
expect(response.statusCode).toBe(400);
const owner = await Db.collections.User!.findOneOrFail({ email: INITIAL_TEST_USER.email });
expect(owner.resetPasswordToken).toBeNull();
}
});
test('POST /forgot-password should fail if user is not found', async () => {
const authlessAgent = utils.createAgent(app);
config.set('userManagement.emails.mode', 'smtp');
const response = await authlessAgent.post('/forgot-password').send({ email: randomEmail() });
// should respond with 200 so as not to reveal to the requester whether the email exists
expect(response.statusCode).toBe(200);
});
test('GET /resolve-password-token should succeed with valid inputs', async () => {
const authlessAgent = utils.createAgent(app);
const resetPasswordToken = uuid();
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) + 100;
await Db.collections.User!.update(INITIAL_TEST_USER.id, {
resetPasswordToken,
resetPasswordTokenExpiration,
});
const response = await authlessAgent
.get('/resolve-password-token')
.query({ userId: INITIAL_TEST_USER.id, token: resetPasswordToken });
expect(response.statusCode).toBe(200);
});
test('GET /resolve-password-token should fail with invalid inputs', async () => {
const authlessAgent = utils.createAgent(app);
config.set('userManagement.emails.mode', 'smtp');
const first = await authlessAgent.get('/resolve-password-token').query({ token: uuid() });
const second = await authlessAgent
.get('/resolve-password-token')
.query({ userId: INITIAL_TEST_USER.id });
for (const response of [first, second]) {
expect(response.statusCode).toBe(400);
}
});
test('GET /resolve-password-token should fail if user is not found', async () => {
const authlessAgent = utils.createAgent(app);
config.set('userManagement.emails.mode', 'smtp');
const response = await authlessAgent
.get('/resolve-password-token')
.query({ userId: INITIAL_TEST_USER.id, token: uuid() });
expect(response.statusCode).toBe(404);
});
test('GET /resolve-password-token should fail if token is expired', async () => {
const authlessAgent = utils.createAgent(app);
const resetPasswordToken = uuid();
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) - 1;
await Db.collections.User!.update(INITIAL_TEST_USER.id, {
resetPasswordToken,
resetPasswordTokenExpiration,
});
config.set('userManagement.emails.mode', 'smtp');
const response = await authlessAgent
.get('/resolve-password-token')
.query({ userId: INITIAL_TEST_USER.id, token: resetPasswordToken });
expect(response.statusCode).toBe(404);
});
test('POST /change-password should succeed with valid inputs', async () => {
const authlessAgent = utils.createAgent(app);
const resetPasswordToken = uuid();
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) + 100;
await Db.collections.User!.update(INITIAL_TEST_USER.id, {
resetPasswordToken,
resetPasswordTokenExpiration,
});
const passwordToStore = randomValidPassword();
const response = await authlessAgent.post('/change-password').send({
token: resetPasswordToken,
userId: INITIAL_TEST_USER.id,
password: passwordToStore,
});
expect(response.statusCode).toBe(200);
const authToken = utils.getAuthToken(response);
expect(authToken).toBeDefined();
const { password: storedPassword } = await Db.collections.User!.findOneOrFail(
INITIAL_TEST_USER.id,
);
const comparisonResult = await compare(passwordToStore, storedPassword!);
expect(comparisonResult).toBe(true);
expect(storedPassword).not.toBe(passwordToStore);
});
test('POST /change-password should fail with invalid inputs', async () => {
const authlessAgent = utils.createAgent(app);
const resetPasswordToken = uuid();
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) + 100;
await Db.collections.User!.update(INITIAL_TEST_USER.id, {
resetPasswordToken,
resetPasswordTokenExpiration,
});
const invalidPayloads = [
{ token: uuid() },
{ id: INITIAL_TEST_USER.id },
{ password: randomValidPassword() },
{ token: uuid(), id: INITIAL_TEST_USER.id },
{ token: uuid(), password: randomValidPassword() },
{ id: INITIAL_TEST_USER.id, password: randomValidPassword() },
{
id: INITIAL_TEST_USER.id,
password: randomInvalidPassword(),
token: resetPasswordToken,
},
{
id: INITIAL_TEST_USER.id,
password: randomValidPassword(),
token: uuid(),
},
];
const { password: originalHashedPassword } = await Db.collections.User!.findOneOrFail();
for (const invalidPayload of invalidPayloads) {
const response = await authlessAgent.post('/change-password').query(invalidPayload);
expect(response.statusCode).toBe(400);
const { password: fetchedHashedPassword } = await Db.collections.User!.findOneOrFail();
expect(originalHashedPassword).toBe(fetchedHashedPassword);
}
});
test('POST /change-password should fail when token has expired', async () => {
const authlessAgent = utils.createAgent(app);
const resetPasswordToken = uuid();
const resetPasswordTokenExpiration = Math.floor(Date.now() / 1000) - 1;
await Db.collections.User!.update(INITIAL_TEST_USER.id, {
resetPasswordToken,
resetPasswordTokenExpiration,
});
const passwordToStore = randomValidPassword();
const response = await authlessAgent.post('/change-password').send({
token: resetPasswordToken,
userId: INITIAL_TEST_USER.id,
password: passwordToStore,
});
expect(response.statusCode).toBe(404);
});
const INITIAL_TEST_USER = {
id: uuid(),
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
};

@@ -0,0 +1,21 @@
import superagent = require('superagent');
import { ObjectLiteral } from 'typeorm';
/**
* Make `SuperTest<T>` string-indexable.
*/
declare module 'supertest' {
interface SuperTest<T extends superagent.SuperAgentRequest>
extends superagent.SuperAgent<T>,
Record<string, any> {}
}
/**
* Prevent `repository.delete({})` (non-criteria) from triggering the type error
* `Expression produces a union type that is too complex to represent.ts(2590)`
*/
declare module 'typeorm' {
interface Repository<Entity extends ObjectLiteral> {
delete(criteria: {}): Promise<void>;
}
}

@@ -0,0 +1,59 @@
import config = require('../../../config');
export const REST_PATH_SEGMENT = config.get('endpoints.rest') as Readonly<string>;
export const AUTHLESS_ENDPOINTS: Readonly<string[]> = [
'healthz',
'metrics',
config.get('endpoints.webhook') as string,
config.get('endpoints.webhookWaiting') as string,
config.get('endpoints.webhookTest') as string,
];
export const SUCCESS_RESPONSE_BODY = {
data: {
success: true,
},
} as const;
export const LOGGED_OUT_RESPONSE_BODY = {
data: {
loggedOut: true,
},
};
/**
* Routes requiring a valid `n8n-auth` cookie for a user, either owner or member.
*/
export const ROUTES_REQUIRING_AUTHENTICATION: Readonly<string[]> = [
'GET /me',
'PATCH /me',
'PATCH /me/password',
'POST /me/survey',
'POST /owner',
'GET /non-existent',
];
/**
* Routes requiring a valid `n8n-auth` cookie for an owner.
*/
export const ROUTES_REQUIRING_AUTHORIZATION: Readonly<string[]> = [
'POST /users',
'GET /users',
'DELETE /users/123',
'POST /users/123/reinvite',
'POST /owner',
'POST /owner/skip-setup',
];
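// How these lists are presumably consumed by the auth middleware suite (not part
// of this excerpt): split each 'VERB /path' entry and issue it on an agent. The
// helper below is an illustrative sketch, not the suite's actual code; the string
// index on the agent relies on the SuperTest augmentation in the shared typings.
import * as request from 'supertest';

export function requestAllRoutes(
  agent: request.SuperTest<request.Test>,
  routes: Readonly<string[]>,
): Promise<request.Response[]> {
  return Promise.all(
    routes.map((route) => {
      const [method, endpoint] = route.split(' ');
      return agent[method.toLowerCase()](endpoint);
    }),
  );
}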
/**
* Name of the connection used for creating and dropping a Postgres DB
* for each suite test run.
*/
export const BOOTSTRAP_POSTGRES_CONNECTION_NAME: Readonly<string> = 'n8n_bs_postgres';
/**
* Name of the connection (and database) used for creating and dropping a MySQL DB
* for each suite test run.
*/
export const BOOTSTRAP_MYSQL_CONNECTION_NAME: Readonly<string> = 'n8n_bs_mysql';

@@ -0,0 +1,41 @@
import { randomBytes } from 'crypto';
import { MAX_PASSWORD_LENGTH, MIN_PASSWORD_LENGTH } from '../../../src/databases/entities/User';
/**
* Create a random hex string of random length between two limits, both inclusive.
* Limits should be even numbers (otherwise the length is rounded down).
*/
export function randomString(min: number, max: number) {
const randomInteger = Math.floor(Math.random() * (max - min) + min) + 1;
return randomBytes(randomInteger / 2).toString('hex');
}
const chooseRandomly = <T>(array: T[]) => array[Math.floor(Math.random() * array.length)];
const randomDigit = () => Math.floor(Math.random() * 10);
const randomUppercaseLetter = () => chooseRandomly('ABCDEFGHIJKLMNOPQRSTUVWXYZ'.split(''));
export const randomValidPassword = () =>
randomString(MIN_PASSWORD_LENGTH, MAX_PASSWORD_LENGTH) + randomUppercaseLetter() + randomDigit();
export const randomInvalidPassword = () =>
chooseRandomly([
randomString(1, MIN_PASSWORD_LENGTH - 1),
randomString(MAX_PASSWORD_LENGTH + 2, MAX_PASSWORD_LENGTH + 100),
'abcdefgh', // valid length, no number, no uppercase
'abcdefg1', // valid length, has number, no uppercase
'abcdefgA', // valid length, no number, has uppercase
'abcdefA', // invalid length, no number, has uppercase
'abcdef1', // invalid length, has number, no uppercase
'abcdeA1', // invalid length, has number, has uppercase
'abcdefg', // invalid length, no number, no uppercase
]);
export const randomEmail = () => `${randomName()}@${randomName()}.${randomTopLevelDomain()}`;
const POPULAR_TOP_LEVEL_DOMAINS = ['com', 'org', 'net', 'io', 'edu'];
const randomTopLevelDomain = () => chooseRandomly(POPULAR_TOP_LEVEL_DOMAINS);
export const randomName = () => randomString(4, 8);

@@ -0,0 +1,389 @@
import { createConnection, getConnection, ConnectionOptions } from 'typeorm';
import { Credentials, UserSettings } from 'n8n-core';
import config = require('../../../config');
import { BOOTSTRAP_MYSQL_CONNECTION_NAME, BOOTSTRAP_POSTGRES_CONNECTION_NAME } from './constants';
import { DatabaseType, Db, ICredentialsDb, IDatabaseCollections } from '../../../src';
import { randomEmail, randomName, randomString, randomValidPassword } from './random';
import { CredentialsEntity } from '../../../src/databases/entities/CredentialsEntity';
import { RESPONSE_ERROR_MESSAGES } from '../../../src/constants';
import { entities } from '../../../src/databases/entities';
import { mysqlMigrations } from '../../../src/databases/mysqldb/migrations';
import { postgresMigrations } from '../../../src/databases/postgresdb/migrations';
import { sqliteMigrations } from '../../../src/databases/sqlite/migrations';
import type { Role } from '../../../src/databases/entities/Role';
import type { User } from '../../../src/databases/entities/User';
import type { CredentialPayload } from './types';
/**
* Initialize one test DB per suite run, with bootstrap connection if needed.
*/
export async function init() {
const dbType = config.get('database.type') as DatabaseType;
if (dbType === 'sqlite') {
// no bootstrap connection required
const testDbName = `n8n_test_sqlite_${randomString(6, 10)}_${Date.now()}`;
await Db.init(getSqliteOptions({ name: testDbName }));
await getConnection(testDbName).runMigrations({ transaction: 'none' });
return { testDbName };
}
if (dbType === 'postgresdb') {
let bootstrapPostgres;
const bootstrapPostgresOptions = getBootstrapPostgresOptions();
try {
bootstrapPostgres = await createConnection(bootstrapPostgresOptions);
} catch (error) {
const { username, password, host, port, schema } = bootstrapPostgresOptions;
console.error(
`ERROR: Failed to connect to Postgres default DB 'postgres'.\nPlease review your Postgres connection options:\n\thost: "${host}"\n\tusername: "${username}"\n\tpassword: "${password}"\n\tport: "${port}"\n\tschema: "${schema}"\nFix by setting correct values via environment variables:\n\texport DB_POSTGRESDB_HOST=value\n\texport DB_POSTGRESDB_USER=value\n\texport DB_POSTGRESDB_PASSWORD=value\n\texport DB_POSTGRESDB_PORT=value\n\texport DB_POSTGRESDB_SCHEMA=value`,
);
process.exit(1);
}
const testDbName = `pg_${randomString(6, 10)}_${Date.now()}_n8n_test`;
await bootstrapPostgres.query(`CREATE DATABASE ${testDbName};`);
await Db.init(getPostgresOptions({ name: testDbName }));
return { testDbName };
}
if (dbType === 'mysqldb') {
const bootstrapMysql = await createConnection(getBootstrapMySqlOptions());
const testDbName = `mysql_${randomString(6, 10)}_${Date.now()}_n8n_test`;
await bootstrapMysql.query(`CREATE DATABASE ${testDbName};`);
await Db.init(getMySqlOptions({ name: testDbName }));
return { testDbName };
}
throw new Error(`Unrecognized DB type: ${dbType}`);
}
/**
* Drop the test DB, closing the bootstrap connection if one exists.
*/
export async function terminate(testDbName: string) {
const dbType = config.get('database.type') as DatabaseType;
if (dbType === 'sqlite') {
await getConnection(testDbName).close();
}
if (dbType === 'postgresdb') {
await getConnection(testDbName).close();
const bootstrapPostgres = getConnection(BOOTSTRAP_POSTGRES_CONNECTION_NAME);
await bootstrapPostgres.query(`DROP DATABASE ${testDbName}`);
await bootstrapPostgres.close();
}
if (dbType === 'mysqldb') {
await getConnection(testDbName).close();
const bootstrapMySql = getConnection(BOOTSTRAP_MYSQL_CONNECTION_NAME);
await bootstrapMySql.query(`DROP DATABASE ${testDbName}`);
await bootstrapMySql.close();
}
}
/**
* Truncate DB tables for specified entities.
*
* @param entities Array of entity names whose tables to truncate.
* @param testDbName Name of the test DB to truncate tables in.
*/
export async function truncate(entities: Array<keyof IDatabaseCollections>, testDbName: string) {
const dbType = config.get('database.type');
if (dbType === 'sqlite') {
const testDb = getConnection(testDbName);
await testDb.query('PRAGMA foreign_keys=OFF');
await Promise.all(entities.map((entity) => Db.collections[entity]!.clear()));
return testDb.query('PRAGMA foreign_keys=ON');
}
const map: { [K in keyof IDatabaseCollections]: string } = {
Credentials: 'credentials_entity',
Workflow: 'workflow_entity',
Execution: 'execution_entity',
Tag: 'tag_entity',
Webhook: 'webhook_entity',
Role: 'role',
User: 'user',
SharedCredentials: 'shared_credentials',
SharedWorkflow: 'shared_workflow',
Settings: 'settings',
};
if (dbType === 'postgresdb') {
return Promise.all(
entities.map((entity) =>
getConnection(testDbName).query(
`TRUNCATE TABLE "${map[entity]}" RESTART IDENTITY CASCADE;`,
),
),
);
}
// MySQL truncation requires globals, which cannot be safely manipulated by parallel tests
if (dbType === 'mysqldb') {
await Promise.all(
entities.map(async (entity) => {
await Db.collections[entity]!.delete({});
await getConnection(testDbName).query(`ALTER TABLE ${map[entity]} AUTO_INCREMENT = 1;`);
}),
);
}
}
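// A minimal ordering sketch (hypothetical helper, not used by the suites in this
// PR as-is): shared join tables are cleared first and in a separate call, which
// mirrors how the user-management suite below resets state between tests.
export async function truncateUserRelatedTables(testDbName: string) {
  await truncate(['SharedCredentials', 'SharedWorkflow'], testDbName);
  await truncate(['User', 'Workflow', 'Credentials'], testDbName);
}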
// ----------------------------------
// credential creation
// ----------------------------------
/**
* Save a credential to the test DB, sharing it with a user.
*/
export async function saveCredential(
credentialPayload: CredentialPayload,
{ user, role }: { user: User; role: Role },
) {
const newCredential = new CredentialsEntity();
Object.assign(newCredential, credentialPayload);
const encryptedData = await encryptCredentialData(newCredential);
Object.assign(newCredential, encryptedData);
const savedCredential = await Db.collections.Credentials!.save(newCredential);
savedCredential.data = newCredential.data;
await Db.collections.SharedCredentials!.save({
user,
credentials: savedCredential,
role,
});
return savedCredential;
}
// ----------------------------------
// user creation
// ----------------------------------
/**
* Store a user in the DB, defaulting to a `member`.
*/
export async function createUser(attributes: Partial<User> = {}): Promise<User> {
const { email, password, firstName, lastName, globalRole, ...rest } = attributes;
const user = {
email: email ?? randomEmail(),
password: password ?? randomValidPassword(),
firstName: firstName ?? randomName(),
lastName: lastName ?? randomName(),
globalRole: globalRole ?? (await getGlobalMemberRole()),
...rest,
};
return Db.collections.User!.save(user);
}
export async function createOwnerShell() {
const globalRole = await getGlobalOwnerRole();
return Db.collections.User!.save({ globalRole });
}
export async function createMemberShell() {
const globalRole = await getGlobalMemberRole();
return Db.collections.User!.save({ globalRole });
}
// ----------------------------------
// role fetchers
// ----------------------------------
export async function getGlobalOwnerRole() {
return await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'global',
});
}
export async function getGlobalMemberRole() {
return await Db.collections.Role!.findOneOrFail({
name: 'member',
scope: 'global',
});
}
export async function getWorkflowOwnerRole() {
return await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'workflow',
});
}
export async function getCredentialOwnerRole() {
return await Db.collections.Role!.findOneOrFail({
name: 'owner',
scope: 'credential',
});
}
export function getAllRoles() {
return Promise.all([
getGlobalOwnerRole(),
getGlobalMemberRole(),
getWorkflowOwnerRole(),
getCredentialOwnerRole(),
]);
}
// ----------------------------------
// connection options
// ----------------------------------
/**
* Generate options for an in-memory sqlite database connection,
* one per test suite run.
*/
export const getSqliteOptions = ({ name }: { name: string }): ConnectionOptions => {
return {
name,
type: 'sqlite',
database: ':memory:',
entityPrefix: '',
dropSchema: true,
migrations: sqliteMigrations,
migrationsTableName: 'migrations',
migrationsRun: false,
};
};
/**
* Generate options for a bootstrap Postgres connection,
* to create and drop test Postgres databases.
*/
export const getBootstrapPostgresOptions = () => {
const username = config.get('database.postgresdb.user');
const password = config.get('database.postgresdb.password');
const host = config.get('database.postgresdb.host');
const port = config.get('database.postgresdb.port');
const schema = config.get('database.postgresdb.schema');
return {
name: BOOTSTRAP_POSTGRES_CONNECTION_NAME,
type: 'postgres',
database: 'postgres', // pre-existing default database
host,
port,
username,
password,
schema,
} as const;
};
export const getPostgresOptions = ({ name }: { name: string }): ConnectionOptions => {
const username = config.get('database.postgresdb.user');
const password = config.get('database.postgresdb.password');
const host = config.get('database.postgresdb.host');
const port = config.get('database.postgresdb.port');
const schema = config.get('database.postgresdb.schema');
return {
name,
type: 'postgres',
database: name,
host,
port,
username,
password,
entityPrefix: '',
schema,
dropSchema: true,
migrations: postgresMigrations,
migrationsRun: true,
migrationsTableName: 'migrations',
entities: Object.values(entities),
synchronize: false,
logging: false,
};
};
/**
* Generate options for a bootstrap MySQL connection,
* to create and drop test MySQL databases.
*/
export const getBootstrapMySqlOptions = (): ConnectionOptions => {
const username = config.get('database.mysqldb.user');
const password = config.get('database.mysqldb.password');
const host = config.get('database.mysqldb.host');
const port = config.get('database.mysqldb.port');
return {
name: BOOTSTRAP_MYSQL_CONNECTION_NAME,
database: BOOTSTRAP_MYSQL_CONNECTION_NAME,
type: 'mysql',
host,
port,
username,
password,
};
};
/**
* Generate options for a MySQL database connection,
* one per test suite run.
*/
export const getMySqlOptions = ({ name }: { name: string }): ConnectionOptions => {
const username = config.get('database.mysqldb.user');
const password = config.get('database.mysqldb.password');
const host = config.get('database.mysqldb.host');
const port = config.get('database.mysqldb.port');
return {
name,
database: name,
type: 'mysql',
host,
port,
username,
password,
migrations: mysqlMigrations,
migrationsTableName: 'migrations',
migrationsRun: true,
};
};
// ----------------------------------
// encryption
// ----------------------------------
async function encryptCredentialData(credential: CredentialsEntity) {
const encryptionKey = await UserSettings.getEncryptionKey();
if (!encryptionKey) {
throw new Error(RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY);
}
const coreCredential = new Credentials(
{ id: null, name: credential.name },
credential.type,
credential.nodesAccess,
);
// @ts-ignore
coreCredential.setData(credential.data, encryptionKey);
return coreCredential.getDataToSave() as ICredentialsDb;
}
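// A hedged counterpart sketch (not used by these tests): decrypt a stored
// credential, assuming n8n-core's Credentials.getData(encryptionKey) as the
// inverse of the setData call above.
async function decryptCredentialData(credential: CredentialsEntity) {
  const encryptionKey = await UserSettings.getEncryptionKey();
  if (!encryptionKey) {
    throw new Error(RESPONSE_ERROR_MESSAGES.NO_ENCRYPTION_KEY);
  }
  const coreCredential = new Credentials(
    { id: null, name: credential.name },
    credential.type,
    credential.nodesAccess,
    credential.data, // the encrypted string as stored in the DB
  );
  return coreCredential.getData(encryptionKey);
}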

@@ -0,0 +1,28 @@
import type { ICredentialDataDecryptedObject, ICredentialNodeAccess } from 'n8n-workflow';
import type { ICredentialsDb } from '../../../src';
import type { CredentialsEntity } from '../../../src/databases/entities/CredentialsEntity';
import type { User } from '../../../src/databases/entities/User';
export type SmtpTestAccount = {
user: string;
pass: string;
smtp: {
host: string;
port: number;
secure: boolean;
};
};
export type EndpointGroup = 'me' | 'users' | 'auth' | 'owner' | 'passwordReset' | 'credentials';
export type CredentialPayload = {
name: string;
type: string;
nodesAccess: ICredentialNodeAccess[];
data: ICredentialDataDecryptedObject;
};
export type SaveCredentialFunction = (
credentialPayload: CredentialPayload,
{ user }: { user: User },
) => Promise<CredentialsEntity & ICredentialsDb>;

@@ -0,0 +1,220 @@
import { randomBytes } from 'crypto';
import { existsSync } from 'fs';
import express = require('express');
import * as superagent from 'superagent';
import * as request from 'supertest';
import { URL } from 'url';
import bodyParser = require('body-parser');
import * as util from 'util';
import { createTestAccount } from 'nodemailer';
import { INodeTypes, LoggerProxy } from 'n8n-workflow';
import { UserSettings } from 'n8n-core';
import config = require('../../../config');
import { AUTHLESS_ENDPOINTS, REST_PATH_SEGMENT } from './constants';
import { AUTH_COOKIE_NAME } from '../../../src/constants';
import { addRoutes as authMiddleware } from '../../../src/UserManagement/routes';
import { Db, ExternalHooks, InternalHooksManager } from '../../../src';
import { meNamespace as meEndpoints } from '../../../src/UserManagement/routes/me';
import { usersNamespace as usersEndpoints } from '../../../src/UserManagement/routes/users';
import { authenticationMethods as authEndpoints } from '../../../src/UserManagement/routes/auth';
import { ownerNamespace as ownerEndpoints } from '../../../src/UserManagement/routes/owner';
import { passwordResetNamespace as passwordResetEndpoints } from '../../../src/UserManagement/routes/passwordReset';
import { issueJWT } from '../../../src/UserManagement/auth/jwt';
import { getLogger } from '../../../src/Logger';
import { credentialsController } from '../../../src/api/credentials.api';
import type { User } from '../../../src/databases/entities/User';
import { Telemetry } from '../../../src/telemetry';
import type { EndpointGroup, SmtpTestAccount } from './types';
import type { N8nApp } from '../../../src/UserManagement/Interfaces';
/**
* Initialize a test server.
*
* @param applyAuth Whether to apply auth middleware to test server.
* @param endpointGroups Groups of endpoints to apply to test server.
*/
export function initTestServer({
applyAuth,
endpointGroups,
}: {
applyAuth: boolean;
endpointGroups?: EndpointGroup[];
}) {
const testServer = {
app: express(),
restEndpoint: REST_PATH_SEGMENT,
...(endpointGroups?.includes('credentials') ? { externalHooks: ExternalHooks() } : {}),
};
testServer.app.use(bodyParser.json());
testServer.app.use(bodyParser.urlencoded({ extended: true }));
config.set('userManagement.jwtSecret', 'My JWT secret');
config.set('userManagement.isInstanceOwnerSetUp', false);
if (applyAuth) {
authMiddleware.apply(testServer, [AUTHLESS_ENDPOINTS, REST_PATH_SEGMENT]);
}
if (!endpointGroups) return testServer.app;
const [routerEndpoints, functionEndpoints] = classifyEndpointGroups(endpointGroups);
if (routerEndpoints.length) {
const map: Record<string, express.Router> = {
credentials: credentialsController,
};
for (const group of routerEndpoints) {
testServer.app.use(`/${testServer.restEndpoint}/${group}`, map[group]);
}
}
if (functionEndpoints.length) {
const map: Record<string, (this: N8nApp) => void> = {
me: meEndpoints,
users: usersEndpoints,
auth: authEndpoints,
owner: ownerEndpoints,
passwordReset: passwordResetEndpoints,
};
for (const group of functionEndpoints) {
map[group].apply(testServer);
}
}
return testServer.app;
}
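// For instance (assumed wiring; the credentials suite's own setup sits outside
// this excerpt), the router-based group is mounted by the same initializer as
// the legacy function-based groups used throughout the suites above and below:
export function initCredentialsTestServer() {
  return initTestServer({ endpointGroups: ['credentials'], applyAuth: true });
}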
export function initTestTelemetry() {
const mockNodeTypes = { nodeTypes: {} } as INodeTypes;
void InternalHooksManager.init('test-instance-id', 'test-version', mockNodeTypes);
jest.spyOn(Telemetry.prototype, 'track').mockResolvedValue();
}
/**
* Classify endpoint groups into `routerEndpoints` (newer, using `express.Router`)
* and `functionEndpoints` (legacy, namespaced inside a function).
*/
const classifyEndpointGroups = (endpointGroups: string[]) => {
const routerEndpoints: string[] = [];
const functionEndpoints: string[] = [];
endpointGroups.forEach((group) =>
(group === 'credentials' ? routerEndpoints : functionEndpoints).push(group),
);
return [routerEndpoints, functionEndpoints];
};
// ----------------------------------
// initializers
// ----------------------------------
/**
* Initialize a silent logger for test runs.
*/
export function initTestLogger() {
config.set('logs.output', 'file'); // declutter console output
LoggerProxy.init(getLogger());
}
/**
* Initialize a user settings config file if non-existent.
*/
export function initConfigFile() {
const settingsPath = UserSettings.getUserSettingsPath();
if (!existsSync(settingsPath)) {
const userSettings = { encryptionKey: randomBytes(24).toString('base64') };
UserSettings.writeUserSettings(userSettings, settingsPath);
}
}
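// A hedged sketch (hypothetical helper, not called by the suites here): combine
// the config-file bootstrap with reading the key back, so credential encryption
// in testDb.saveCredential has a key to work with.
export async function ensureTestEncryptionKey() {
  initConfigFile();
  const encryptionKey = await UserSettings.getEncryptionKey();
  if (!encryptionKey) {
    throw new Error('No encryption key found after initializing the user settings file');
  }
  return encryptionKey;
}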
// ----------------------------------
// request agent
// ----------------------------------
/**
* Create a request agent, optionally with an auth cookie.
*/
export function createAgent(app: express.Application, options?: { auth: true; user: User }) {
const agent = request.agent(app);
agent.use(prefix(REST_PATH_SEGMENT));
if (options?.auth && options?.user) {
const { token } = issueJWT(options.user);
agent.jar.setCookie(`${AUTH_COOKIE_NAME}=${token}`);
}
return agent;
}
/**
* Plugin to prefix a path segment into a request URL pathname.
*
* Example: http://127.0.0.1:62100/me/password → http://127.0.0.1:62100/rest/me/password
*/
export function prefix(pathSegment: string) {
return function (request: superagent.SuperAgentRequest) {
const url = new URL(request.url);
// enforce consistency at call sites
if (url.pathname[0] !== '/') {
throw new Error('Pathname must start with a forward slash');
}
url.pathname = pathSegment + url.pathname;
request.url = url.toString();
return request;
};
}
/**
* Extract the value (token) of the auth cookie in a response.
*/
export function getAuthToken(response: request.Response, authCookieName = AUTH_COOKIE_NAME) {
const cookies: string[] = response.headers['set-cookie'];
if (!cookies) {
throw new Error("No 'set-cookie' header found in response");
}
const authCookie = cookies.find((c) => c.startsWith(`${authCookieName}=`));
if (!authCookie) return undefined;
const match = authCookie.match(new RegExp(`(^| )${authCookieName}=(?<token>[^;]+)`));
if (!match || !match.groups) return undefined;
return match.groups.token;
}
// ----------------------------------
// settings
// ----------------------------------
export async function isInstanceOwnerSetUp() {
const { value } = await Db.collections.Settings!.findOneOrFail({
key: 'userManagement.isInstanceOwnerSetUp',
});
return Boolean(value);
}
// ----------------------------------
// SMTP
// ----------------------------------
/**
* Get an SMTP test account from https://ethereal.email to test sending emails.
*/
export const getSmtpTestAccount = util.promisify<SmtpTestAccount>(createTestAccount);

@@ -0,0 +1,600 @@
import express = require('express');
import validator from 'validator';
import { v4 as uuid } from 'uuid';
import { compare } from 'bcryptjs';
import { Db } from '../../src';
import config = require('../../config');
import { SUCCESS_RESPONSE_BODY } from './shared/constants';
import { Role } from '../../src/databases/entities/Role';
import {
randomEmail,
randomValidPassword,
randomName,
randomInvalidPassword,
} from './shared/random';
import { CredentialsEntity } from '../../src/databases/entities/CredentialsEntity';
import { WorkflowEntity } from '../../src/databases/entities/WorkflowEntity';
import * as utils from './shared/utils';
import * as testDb from './shared/testDb';
let app: express.Application;
let testDbName = '';
let globalOwnerRole: Role;
let globalMemberRole: Role;
let workflowOwnerRole: Role;
let credentialOwnerRole: Role;
beforeAll(async () => {
app = utils.initTestServer({ endpointGroups: ['users'], applyAuth: true });
const initResult = await testDb.init();
testDbName = initResult.testDbName;
const [
fetchedGlobalOwnerRole,
fetchedGlobalMemberRole,
fetchedWorkflowOwnerRole,
fetchedCredentialOwnerRole,
] = await testDb.getAllRoles();
globalOwnerRole = fetchedGlobalOwnerRole;
globalMemberRole = fetchedGlobalMemberRole;
workflowOwnerRole = fetchedWorkflowOwnerRole;
credentialOwnerRole = fetchedCredentialOwnerRole;
utils.initTestTelemetry();
utils.initTestLogger();
});
beforeEach(async () => {
// do not combine calls - shared tables must be cleared first and separately
await testDb.truncate(['SharedCredentials', 'SharedWorkflow'], testDbName);
await testDb.truncate(['User', 'Workflow', 'Credentials'], testDbName);
jest.isolateModules(() => {
jest.mock('../../config');
});
await testDb.createUser({
id: INITIAL_TEST_USER.id,
email: INITIAL_TEST_USER.email,
password: INITIAL_TEST_USER.password,
firstName: INITIAL_TEST_USER.firstName,
lastName: INITIAL_TEST_USER.lastName,
globalRole: globalOwnerRole,
});
config.set('userManagement.disabled', false);
config.set('userManagement.isInstanceOwnerSetUp', true);
config.set('userManagement.emails.mode', '');
});
afterAll(async () => {
await testDb.terminate(testDbName);
});
test('GET /users should return all users', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
await testDb.createUser();
const response = await authOwnerAgent.get('/users');
expect(response.statusCode).toBe(200);
expect(response.body.data.length).toBe(2);
for (const user of response.body.data) {
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
globalRole,
password,
resetPasswordToken,
isPending,
} = user;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBeDefined();
expect(firstName).toBeDefined();
expect(lastName).toBeDefined();
expect(personalizationAnswers).toBeUndefined();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole).toBeDefined();
}
});
test('DELETE /users/:id should delete the user', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const userToDelete = await testDb.createUser();
const newWorkflow = new WorkflowEntity();
Object.assign(newWorkflow, {
name: randomName(),
active: false,
connections: {},
nodes: [],
});
const savedWorkflow = await Db.collections.Workflow!.save(newWorkflow);
await Db.collections.SharedWorkflow!.save({
role: workflowOwnerRole,
user: userToDelete,
workflow: savedWorkflow,
});
const newCredential = new CredentialsEntity();
Object.assign(newCredential, {
name: randomName(),
data: '',
type: '',
nodesAccess: [],
});
const savedCredential = await Db.collections.Credentials!.save(newCredential);
await Db.collections.SharedCredentials!.save({
role: credentialOwnerRole,
user: userToDelete,
credentials: savedCredential,
});
const response = await authOwnerAgent.delete(`/users/${userToDelete.id}`);
expect(response.statusCode).toBe(200);
expect(response.body).toEqual(SUCCESS_RESPONSE_BODY);
const user = await Db.collections.User!.findOne(userToDelete.id);
expect(user).toBeUndefined(); // deleted
const sharedWorkflow = await Db.collections.SharedWorkflow!.findOne({
relations: ['user'],
where: { user: userToDelete },
});
expect(sharedWorkflow).toBeUndefined(); // deleted
const sharedCredential = await Db.collections.SharedCredentials!.findOne({
relations: ['user'],
where: { user: userToDelete },
});
expect(sharedCredential).toBeUndefined(); // deleted
const workflow = await Db.collections.Workflow!.findOne(savedWorkflow.id);
expect(workflow).toBeUndefined(); // deleted
// TODO: Include active workflow and check whether webhook has been removed
const credential = await Db.collections.Credentials!.findOne(savedCredential.id);
expect(credential).toBeUndefined(); // deleted
});
test('DELETE /users/:id should fail to delete self', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.delete(`/users/${owner.id}`);
expect(response.statusCode).toBe(400);
const user = await Db.collections.User!.findOne(owner.id);
expect(user).toBeDefined();
});
test('DELETE /users/:id should fail if user to delete is transferee', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const { id: idToDelete } = await testDb.createUser();
const response = await authOwnerAgent.delete(`/users/${idToDelete}`).query({
transferId: idToDelete,
});
expect(response.statusCode).toBe(400);
const user = await Db.collections.User!.findOne(idToDelete);
expect(user).toBeDefined();
});
test('DELETE /users/:id with transferId should perform transfer', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const userToDelete = await Db.collections.User!.save({
id: uuid(),
email: randomEmail(),
password: randomValidPassword(),
firstName: randomName(),
lastName: randomName(),
createdAt: new Date(),
updatedAt: new Date(),
globalRole: workflowOwnerRole,
});
const newWorkflow = new WorkflowEntity();
Object.assign(newWorkflow, {
name: randomName(),
active: false,
connections: {},
nodes: [],
});
const savedWorkflow = await Db.collections.Workflow!.save(newWorkflow);
await Db.collections.SharedWorkflow!.save({
role: workflowOwnerRole,
user: userToDelete,
workflow: savedWorkflow,
});
const newCredential = new CredentialsEntity();
Object.assign(newCredential, {
name: randomName(),
data: '',
type: '',
nodesAccess: [],
});
const savedCredential = await Db.collections.Credentials!.save(newCredential);
await Db.collections.SharedCredentials!.save({
role: credentialOwnerRole,
user: userToDelete,
credentials: savedCredential,
});
const response = await authOwnerAgent.delete(`/users/${userToDelete.id}`).query({
transferId: owner.id,
});
expect(response.statusCode).toBe(200);
const sharedWorkflow = await Db.collections.SharedWorkflow!.findOneOrFail({
relations: ['user'],
where: { user: owner },
});
const sharedCredential = await Db.collections.SharedCredentials!.findOneOrFail({
relations: ['user'],
where: { user: owner },
});
const deletedUser = await Db.collections.User!.findOne(userToDelete);
expect(sharedWorkflow.user.id).toBe(owner.id);
expect(sharedCredential.user.id).toBe(owner.id);
expect(deletedUser).toBeUndefined();
});
test('GET /resolve-signup-token should validate invite token', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const { id: inviteeId } = await testDb.createMemberShell();
const response = await authOwnerAgent
.get('/resolve-signup-token')
.query({ inviterId: INITIAL_TEST_USER.id })
.query({ inviteeId });
expect(response.statusCode).toBe(200);
expect(response.body).toEqual({
data: {
inviter: {
firstName: INITIAL_TEST_USER.firstName,
lastName: INITIAL_TEST_USER.lastName,
},
},
});
});
test('GET /resolve-signup-token should fail with invalid inputs', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const { id: inviteeId } = await testDb.createUser();
const first = await authOwnerAgent
.get('/resolve-signup-token')
.query({ inviterId: INITIAL_TEST_USER.id });
const second = await authOwnerAgent.get('/resolve-signup-token').query({ inviteeId });
const third = await authOwnerAgent.get('/resolve-signup-token').query({
inviterId: '5531199e-b7ae-425b-a326-a95ef8cca59d',
inviteeId: 'cb133beb-7729-4c34-8cd1-a06be8834d9d',
});
// invitee is already fully set up (not a pending shell), so the call should error
const fourth = await authOwnerAgent
.get('/resolve-signup-token')
.query({ inviterId: INITIAL_TEST_USER.id })
.query({ inviteeId });
// cause inconsistent DB state
await Db.collections.User!.update(owner.id, { email: '' });
const fifth = await authOwnerAgent
.get('/resolve-signup-token')
.query({ inviterId: INITIAL_TEST_USER.id })
.query({ inviteeId });
for (const response of [first, second, third, fourth, fifth]) {
expect(response.statusCode).toBe(400);
}
});
test('POST /users/:id should fill out a user shell', async () => {
const authlessAgent = utils.createAgent(app);
const userToFillOut = await Db.collections.User!.save({
email: randomEmail(),
globalRole: globalMemberRole,
});
const newPassword = randomValidPassword();
const response = await authlessAgent.post(`/users/${userToFillOut.id}`).send({
inviterId: INITIAL_TEST_USER.id,
firstName: INITIAL_TEST_USER.firstName,
lastName: INITIAL_TEST_USER.lastName,
password: newPassword,
});
const {
id,
email,
firstName,
lastName,
personalizationAnswers,
password,
resetPasswordToken,
globalRole,
isPending,
} = response.body.data;
expect(validator.isUUID(id)).toBe(true);
expect(email).toBeDefined();
expect(firstName).toBe(INITIAL_TEST_USER.firstName);
expect(lastName).toBe(INITIAL_TEST_USER.lastName);
expect(personalizationAnswers).toBeNull();
expect(password).toBeUndefined();
expect(resetPasswordToken).toBeUndefined();
expect(isPending).toBe(false);
expect(globalRole).toBeDefined();
const authToken = utils.getAuthToken(response);
expect(authToken).toBeDefined();
const filledOutUser = await Db.collections.User!.findOneOrFail(userToFillOut.id);
expect(filledOutUser.firstName).toBe(INITIAL_TEST_USER.firstName);
expect(filledOutUser.lastName).toBe(INITIAL_TEST_USER.lastName);
expect(filledOutUser.password).not.toBe(newPassword);
});
test('POST /users/:id should fail with invalid inputs', async () => {
const authlessAgent = utils.createAgent(app);
const emailToStore = randomEmail();
const userToFillOut = await Db.collections.User!.save({
email: emailToStore,
globalRole: globalMemberRole,
});
for (const invalidPayload of INVALID_FILL_OUT_USER_PAYLOADS) {
const response = await authlessAgent.post(`/users/${userToFillOut.id}`).send(invalidPayload);
expect(response.statusCode).toBe(400);
const user = await Db.collections.User!.findOneOrFail({ where: { email: emailToStore } });
expect(user.firstName).toBeNull();
expect(user.lastName).toBeNull();
expect(user.password).toBeNull();
}
});
test('POST /users/:id should fail with already accepted invite', async () => {
const authlessAgent = utils.createAgent(app);
const globalMemberRole = await Db.collections.Role!.findOneOrFail({
name: 'member',
scope: 'global',
});
const shell = await Db.collections.User!.save({
email: randomEmail(),
password: randomValidPassword(), // simulate accepted invite
globalRole: globalMemberRole,
});
const newPassword = randomValidPassword();
const response = await authlessAgent.post(`/users/${shell.id}`).send({
inviterId: INITIAL_TEST_USER.id,
firstName: randomName(),
lastName: randomName(),
password: newPassword,
});
expect(response.statusCode).toBe(400);
const fetchedShell = await Db.collections.User!.findOneOrFail({ where: { email: shell.email } });
expect(fetchedShell.firstName).toBeNull();
expect(fetchedShell.lastName).toBeNull();
// compare() takes (plaintext, hash): confirm the submitted password was not stored for the shell
const comparisonResult = await compare(newPassword, fetchedShell.password);
expect(comparisonResult).toBe(false);
expect(newPassword).not.toBe(fetchedShell.password);
});
test('POST /users should fail if emailing is not set up', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const response = await authOwnerAgent.post('/users').send([{ email: randomEmail() }]);
expect(response.statusCode).toBe(500);
});
test('POST /users should fail if user management is disabled', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
config.set('userManagement.disabled', true);
const response = await authOwnerAgent.post('/users').send([{ email: randomEmail() }]);
expect(response.statusCode).toBe(500);
});
test('POST /users should email invites and create user shells', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
const {
user,
pass,
smtp: { host, port, secure },
} = await utils.getSmtpTestAccount();
config.set('userManagement.emails.mode', 'smtp');
config.set('userManagement.emails.smtp.host', host);
config.set('userManagement.emails.smtp.port', port);
config.set('userManagement.emails.smtp.secure', secure);
config.set('userManagement.emails.smtp.auth.user', user);
config.set('userManagement.emails.smtp.auth.pass', pass);
const payload = TEST_EMAILS_TO_CREATE_USER_SHELLS.map((e) => ({ email: e }));
const response = await authOwnerAgent.post('/users').send(payload);
expect(response.statusCode).toBe(200);
for (const {
user: { id, email: receivedEmail },
error,
} of response.body.data) {
expect(validator.isUUID(id)).toBe(true);
expect(TEST_EMAILS_TO_CREATE_USER_SHELLS.some((e) => e === receivedEmail)).toBe(true);
if (error) {
expect(error).toBe('Email could not be sent');
}
const user = await Db.collections.User!.findOneOrFail(id);
const { firstName, lastName, personalizationAnswers, password, resetPasswordToken } = user;
expect(firstName).toBeNull();
expect(lastName).toBeNull();
expect(personalizationAnswers).toBeNull();
expect(password).toBeNull();
expect(resetPasswordToken).toBeNull();
}
});
test('POST /users should fail with invalid inputs', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
config.set('userManagement.emails.mode', 'smtp');
const invalidPayloads = [
randomEmail(),
[randomEmail()],
{},
[{ name: randomName() }],
[{ email: randomName() }],
];
for (const invalidPayload of invalidPayloads) {
const response = await authOwnerAgent.post('/users').send(invalidPayload);
expect(response.statusCode).toBe(400);
const users = await Db.collections.User!.find();
expect(users.length).toBe(1); // DB unaffected
}
});
test('POST /users should ignore an empty payload', async () => {
const owner = await Db.collections.User!.findOneOrFail();
const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
config.set('userManagement.emails.mode', 'smtp');
const response = await authOwnerAgent.post('/users').send([]);
const { data } = response.body;
expect(response.statusCode).toBe(200);
expect(Array.isArray(data)).toBe(true);
expect(data.length).toBe(0);
const users = await Db.collections.User!.find();
expect(users.length).toBe(1);
});
// TODO: /users/:id/reinvite route tests missing
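// A possible shape for such a test, left commented out as a sketch only — the reinvite route's
// exact success contract (status code, response body) is assumed here, not confirmed:
// test('POST /users/:id/reinvite should re-send the invite email', async () => {
// const owner = await Db.collections.User!.findOneOrFail();
// const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
// config.set('userManagement.emails.mode', 'smtp');
// const memberShell = await testDb.createMemberShell();
// const response = await authOwnerAgent.post(`/users/${memberShell.id}/reinvite`);
// expect(response.statusCode).toBe(200);
// });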
// TODO: UserManagementMailer is a singleton - cannot reinstantiate with wrong creds
// test('POST /users should error for wrong SMTP config', async () => {
// const owner = await Db.collections.User!.findOneOrFail();
// const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
// config.set('userManagement.emails.mode', 'smtp');
// config.set('userManagement.emails.smtp.host', 'XYZ'); // break SMTP config
// const payload = TEST_EMAILS_TO_CREATE_USER_SHELLS.map((e) => ({ email: e }));
// const response = await authOwnerAgent.post('/users').send(payload);
// expect(response.statusCode).toBe(500);
// });
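// If this gap is revisited, Jest's module-registry reset is one conventional way to rebuild a
// module-level singleton between tests. Sketch only: it assumes the mailer module is imported
// lazily by the route handler after the reset, which is not confirmed here.
// test('POST /users should error for wrong SMTP config', async () => {
// jest.resetModules(); // clear Jest's module registry so the mailer singleton is re-created on next import
// config.set('userManagement.emails.mode', 'smtp');
// config.set('userManagement.emails.smtp.host', 'XYZ'); // break SMTP config
// const owner = await Db.collections.User!.findOneOrFail();
// const authOwnerAgent = utils.createAgent(app, { auth: true, user: owner });
// const response = await authOwnerAgent.post('/users').send([{ email: randomEmail() }]);
// expect(response.statusCode).toBe(500);
// });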
const INITIAL_TEST_USER = {
id: uuid(),
email: randomEmail(),
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
};
const INVALID_FILL_OUT_USER_PAYLOADS = [
{
// missing inviterId
firstName: randomName(),
lastName: randomName(),
password: randomValidPassword(),
},
{
// missing lastName
inviterId: INITIAL_TEST_USER.id,
firstName: randomName(),
password: randomValidPassword(),
},
{
// missing firstName
inviterId: INITIAL_TEST_USER.id,
lastName: randomName(),
password: randomValidPassword(),
},
{
// missing password
inviterId: INITIAL_TEST_USER.id,
firstName: randomName(),
lastName: randomName(),
},
{
// invalid password
inviterId: INITIAL_TEST_USER.id,
firstName: randomName(),
lastName: randomName(),
password: randomInvalidPassword(),
},
];
const TEST_EMAILS_TO_CREATE_USER_SHELLS = [randomEmail(), randomEmail(), randomEmail()];
