Merge remote-tracking branch 'origin/master' into pay-1745-update-moved-workflow-to-include-option-to-share-credentials

# Conflicts:
#	packages/editor-ui/src/components/Projects/ProjectMoveResourceModal.test.ts
Commit 96f376db54 by Csaba Tuncsik, 2025-01-22 16:19:53 +01:00 (signature not verified: no known key found in database)
90 changed files with 2589 additions and 956 deletions

View file

@ -1,3 +1,55 @@
# [1.76.0](https://github.com/n8n-io/n8n/compare/n8n@1.75.0...n8n@1.76.0) (2025-01-22)
### Bug Fixes
* **core:** Align saving behavior in `workflowExecuteAfter` hooks ([#12731](https://github.com/n8n-io/n8n/issues/12731)) ([9d76210](https://github.com/n8n-io/n8n/commit/9d76210a570e025d01d1f6596667abf40fbd8d12))
* **core:** AugmentObject should handle the constructor property correctly ([#12744](https://github.com/n8n-io/n8n/issues/12744)) ([36bc164](https://github.com/n8n-io/n8n/commit/36bc164da486f2e2d05091b457b8eea6521ca22e))
* **core:** Fix keyboard shortcuts for non-ansi layouts ([#12672](https://github.com/n8n-io/n8n/issues/12672)) ([4c8193f](https://github.com/n8n-io/n8n/commit/4c8193fedc2e3967c9a06c0652483128df509653))
* **core:** Fix license CLI commands showing incorrect renewal setting ([#12759](https://github.com/n8n-io/n8n/issues/12759)) ([024ada8](https://github.com/n8n-io/n8n/commit/024ada822c1bc40958e594bb08707cf77d3397ec))
* **core:** Fix license initialization failure on startup ([#12737](https://github.com/n8n-io/n8n/issues/12737)) ([ac2f647](https://github.com/n8n-io/n8n/commit/ac2f6476c114f51fafb9b7b66e41e0c87f4a1bf6))
* **core:** Recover successful data-less executions ([#12720](https://github.com/n8n-io/n8n/issues/12720)) ([a39b8bd](https://github.com/n8n-io/n8n/commit/a39b8bd32be50c8323e415f820b25b4bcb81d960))
* **core:** Remove run data of utility nodes for partial executions v2 ([#12673](https://github.com/n8n-io/n8n/issues/12673)) ([b66a9dc](https://github.com/n8n-io/n8n/commit/b66a9dc8fb6f7b19122cacbb7e2f86b4c921c3fb))
* **core:** Sync `hookFunctionsSave` and `hookFunctionsSaveWorker` ([#12740](https://github.com/n8n-io/n8n/issues/12740)) ([d410b8f](https://github.com/n8n-io/n8n/commit/d410b8f5a7e99658e1e8dcb2e02901bd01ce9c59))
* **core:** Update isDocker check to return true on kubernetes/containerd ([#12603](https://github.com/n8n-io/n8n/issues/12603)) ([c55dac6](https://github.com/n8n-io/n8n/commit/c55dac66ed97a2317d4c696c3b505790ec5d72fe))
* **editor:** Add unicode code points to expression language for emoji ([#12633](https://github.com/n8n-io/n8n/issues/12633)) ([819ebd0](https://github.com/n8n-io/n8n/commit/819ebd058d1d60b3663d92b4a652728da7134a3b))
* **editor:** Correct missing whitespace in JSON output ([#12677](https://github.com/n8n-io/n8n/issues/12677)) ([b098b19](https://github.com/n8n-io/n8n/commit/b098b19c7f0e3a9848c3fcfa012999050f2d3c7a))
* **editor:** Defer crypto.randomUUID call in CodeNodeEditor ([#12630](https://github.com/n8n-io/n8n/issues/12630)) ([58f6532](https://github.com/n8n-io/n8n/commit/58f6532630bacd288d3c0a79b40150f465898419))
* **editor:** Fix Code node bug erasing and overwriting code when switching between nodes ([#12637](https://github.com/n8n-io/n8n/issues/12637)) ([02d953d](https://github.com/n8n-io/n8n/commit/02d953db34ec4e44977a8ca908628b62cca82fde))
* **editor:** Fix execution list hover & selection colour in dark mode ([#12628](https://github.com/n8n-io/n8n/issues/12628)) ([95c40c0](https://github.com/n8n-io/n8n/commit/95c40c02cb8fef77cf633cf5aec08e98746cff36))
* **editor:** Fix JsonEditor with expressions ([#12739](https://github.com/n8n-io/n8n/issues/12739)) ([56c93ca](https://github.com/n8n-io/n8n/commit/56c93caae026738c1c0bebb4187b238e34a330f6))
* **editor:** Fix navbar height flickering during load ([#12738](https://github.com/n8n-io/n8n/issues/12738)) ([a96b3f0](https://github.com/n8n-io/n8n/commit/a96b3f0091798a52bb33107b919b5d8287ba7506))
* **editor:** Open chat when executing agent node in canvas v2 ([#12617](https://github.com/n8n-io/n8n/issues/12617)) ([457edd9](https://github.com/n8n-io/n8n/commit/457edd99bb853d8ccf3014605d5823933f3c0bc6))
* **editor:** Partial execution of a workflow with manual chat trigger ([#12662](https://github.com/n8n-io/n8n/issues/12662)) ([2f81b29](https://github.com/n8n-io/n8n/commit/2f81b29d341535b512df0aa01b25a91d109f113f))
* **editor:** Show connector label above the line when it's straight ([#12622](https://github.com/n8n-io/n8n/issues/12622)) ([c97bd48](https://github.com/n8n-io/n8n/commit/c97bd48a77643b9c2a5d7218e21b957af15cee0b))
* **editor:** Show run workflow button when chat trigger has pinned data ([#12616](https://github.com/n8n-io/n8n/issues/12616)) ([da8aafc](https://github.com/n8n-io/n8n/commit/da8aafc0e3a1b5d862f0723d0d53d2c38bcaebc3))
* **editor:** Update workflow re-initialization to use query parameter ([#12650](https://github.com/n8n-io/n8n/issues/12650)) ([982131a](https://github.com/n8n-io/n8n/commit/982131a75a32f741c120156826c303989aac189c))
* **Execute Workflow Node:** Pass binary data to sub-workflow ([#12635](https://github.com/n8n-io/n8n/issues/12635)) ([e9c152e](https://github.com/n8n-io/n8n/commit/e9c152e369a4c2762bd8e6ad17eaa704bb3771bb))
* **Google Gemini Chat Model Node:** Add base URL support for Google Gemini Chat API ([#12643](https://github.com/n8n-io/n8n/issues/12643)) ([14f4bc7](https://github.com/n8n-io/n8n/commit/14f4bc769027789513808b4000444edf99dc5d1c))
* **GraphQL Node:** Change default request format to json instead of graphql ([#11346](https://github.com/n8n-io/n8n/issues/11346)) ([c7c122f](https://github.com/n8n-io/n8n/commit/c7c122f9173df824cc1b5ab864333bffd0d31f82))
* **Jira Software Node:** Get custom fields (RLC) in update operation for server deployment type ([#12719](https://github.com/n8n-io/n8n/issues/12719)) ([353df79](https://github.com/n8n-io/n8n/commit/353df7941117e20547cd4f3fc514979a54619720))
* **n8n Form Node:** Remove the ability to change the formatting of dates ([#12666](https://github.com/n8n-io/n8n/issues/12666)) ([14904ff](https://github.com/n8n-io/n8n/commit/14904ff77951fef23eb789a43947492a4cd3fa20))
* **OpenAI Chat Model Node:** Fix loading of custom models when using custom credential URL ([#12634](https://github.com/n8n-io/n8n/issues/12634)) ([7cc553e](https://github.com/n8n-io/n8n/commit/7cc553e3b277a16682bfca1ea08cb98178e38580))
* **OpenAI Chat Model Node:** Restore default model value ([#12745](https://github.com/n8n-io/n8n/issues/12745)) ([d1b6692](https://github.com/n8n-io/n8n/commit/d1b6692736182fa2eab768ba3ad0adb8504ebbbd))
* **Postgres Chat Memory Node:** Do not terminate the connection pool ([#12674](https://github.com/n8n-io/n8n/issues/12674)) ([e7f00bc](https://github.com/n8n-io/n8n/commit/e7f00bcb7f2dce66ca07a9322d50f96356c1a43d))
* **Postgres Node:** Allow using composite key in upsert queries ([#12639](https://github.com/n8n-io/n8n/issues/12639)) ([83ce3a9](https://github.com/n8n-io/n8n/commit/83ce3a90963ba76601234f4314363a8ccc310f0f))
* **Wait Node:** Fix for hasNextPage in waiting forms ([#12636](https://github.com/n8n-io/n8n/issues/12636)) ([652b8d1](https://github.com/n8n-io/n8n/commit/652b8d170b9624d47b5f2d8d679c165cc14ea548))
### Features
* Add credential only node for Microsoft Azure Monitor ([#12645](https://github.com/n8n-io/n8n/issues/12645)) ([6ef8882](https://github.com/n8n-io/n8n/commit/6ef8882a108c672ab097c9dd1c590d4e9e7f3bcc))
* Add Miro credential only node ([#12746](https://github.com/n8n-io/n8n/issues/12746)) ([5b29086](https://github.com/n8n-io/n8n/commit/5b29086e2f9b7f638fac4440711f673438e57492))
* Add SSM endpoint to AWS credentials ([#12212](https://github.com/n8n-io/n8n/issues/12212)) ([565c7b8](https://github.com/n8n-io/n8n/commit/565c7b8b9cfd3e10f6a2c60add96fea4c4d95d33))
* **core:** Enable task runner by default ([#12726](https://github.com/n8n-io/n8n/issues/12726)) ([9e2a01a](https://github.com/n8n-io/n8n/commit/9e2a01aeaf36766a1cf7a1d9a4d6e02f45739bd3))
* **editor:** Force final canvas v2 migration and remove switcher from UI ([#12717](https://github.com/n8n-io/n8n/issues/12717)) ([29335b9](https://github.com/n8n-io/n8n/commit/29335b9b6acf97c817bea70688e8a2786fbd8889))
* **editor:** VariablesView Reskin - Add Filters for missing values ([#12611](https://github.com/n8n-io/n8n/issues/12611)) ([1eeb788](https://github.com/n8n-io/n8n/commit/1eeb788d327287d21eab7ad6f2156453ab7642c7))
* **Jira Software Node:** Personal Access Token credential type ([#11038](https://github.com/n8n-io/n8n/issues/11038)) ([1c7a38f](https://github.com/n8n-io/n8n/commit/1c7a38f6bab108daa47401cd98c185590bf299a8))
* **n8n Form Trigger Node:** Form Improvements ([#12590](https://github.com/n8n-io/n8n/issues/12590)) ([f167578](https://github.com/n8n-io/n8n/commit/f167578b3251e553a4d000e731e1bb60348916ad))
* Synchronize deletions when pulling from source control ([#12170](https://github.com/n8n-io/n8n/issues/12170)) ([967ee4b](https://github.com/n8n-io/n8n/commit/967ee4b89b94b92fc3955c56bf4c9cca0bd64eac))
# [1.75.0](https://github.com/n8n-io/n8n/compare/n8n@1.74.0...n8n@1.75.0) (2025-01-15)

View file

@ -105,7 +105,7 @@ describe('Expression editor modal', () => {
// Run workflow
cy.get('body').type('{esc}');
ndv.actions.close();
WorkflowPage.actions.executeNode('No Operation');
WorkflowPage.actions.executeNode('No Operation, do nothing', { anchor: 'topLeft' });
WorkflowPage.actions.openNode('Hacker News');
WorkflowPage.actions.openExpressionEditorModal();

View file

@ -1,6 +1,6 @@
{
"name": "n8n-monorepo",
"version": "1.75.0",
"version": "1.76.0",
"private": true,
"engines": {
"node": ">=20.15",

View file

@ -27,7 +27,7 @@ docker run ghcr.io/n8n-io/n8n-benchmark:latest run \
--n8nUserPassword=InstanceOwnerPassword \
--vus=5 \
--duration=1m \
--scenarioFilter SingleWebhook
--scenarioFilter=single-webhook
```
### Using custom scenarios with the Docker image

View file

@ -1,6 +1,6 @@
{
"name": "@n8n/n8n-benchmark",
"version": "1.9.0",
"version": "1.10.0",
"description": "Cli for running benchmark tests for n8n",
"main": "dist/index",
"scripts": {

View file

@ -15,7 +15,7 @@ entity { Plaintext | Resolvable }
resolvableChar { unicodeChar | "}" ![}] | "\\}}" }
unicodeChar { $[\u0000-\u007C] | $[\u007E-\u20CF] | $[\u{1F300}-\u{1F64F}] | $[\u4E00-\u9FFF] }
unicodeChar { $[\u0000-\u007C] | $[\u007E-\u20CF] | $[\u{1F300}-\u{1FAF8}] | $[\u4E00-\u9FFF] }
}
@detectDelim
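
To illustrate what the widened range covers, here is a minimal standalone TypeScript sketch (the helper name is illustrative and not part of the grammar build): it checks whether a character's code point falls inside the emoji block the tokenizer now accepts, using the two characters added to the grammar tests further down in this commit.

```ts
// Minimal sketch: the tokenizer's emoji block now runs from U+1F300 to U+1FAF8
// instead of stopping at U+1F64F.
function inGrammarEmojiRange(ch: string): boolean {
  const cp = ch.codePointAt(0) ?? 0;
  return cp >= 0x1f300 && cp <= 0x1faf8;
}

console.log(inGrammarEmojiRange('🟢')); // true — U+1F7E2, outside the old 0x1F64F cap
console.log(inGrammarEmojiRange('🫸')); // true — U+1FAF8, the new end of the range
```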

View file

@ -10,7 +10,7 @@ export const parser = LRParser.deserialize({
skippedNodes: [0],
repeatNodeCount: 1,
tokenData:
"&U~RTO#ob#o#p!h#p;'Sb;'S;=`!]<%lOb~gTQ~O#ob#o#pv#p;'Sb;'S;=`!]<%lOb~yUO#ob#p;'Sb;'S;=`!]<%l~b~Ob~~!c~!`P;=`<%lb~!hOQ~~!kVO#ob#o#p#Q#p;'Sb;'S;=`!]<%l~b~Ob~~!c~#TWO#O#Q#O#P#m#P#q#Q#q#r%Z#r$Ml#Q*5S41d#Q;(b;(c%x;(c;(d&O~#pWO#O#Q#O#P#m#P#q#Q#q#r$Y#r$Ml#Q*5S41d#Q;(b;(c%x;(c;(d&O~$]TO#q#Q#q#r$l#r;'S#Q;'S;=`%r<%lO#Q~$qWR~O#O#Q#O#P#m#P#q#Q#q#r%Z#r$Ml#Q*5S41d#Q;(b;(c%x;(c;(d&O~%^TO#q#Q#q#r%m#r;'S#Q;'S;=`%r<%lO#Q~%rOR~~%uP;=`<%l#Q~%{P;NQ<%l#Q~&RP;=`;JY#Q",
"&_~RTO#ob#o#p!h#p;'Sb;'S;=`!]<%lOb~gTQ~O#ob#o#pv#p;'Sb;'S;=`!]<%lOb~yUO#ob#p;'Sb;'S;=`!]<%l~b~Ob~~!c~!`P;=`<%lb~!hOQ~~!kVO#ob#o#p#Q#p;'Sb;'S;=`!]<%l~b~Ob~~!c~#TXO#O#Q#O#P#p#P#q#Q#q#r%d#r$Ml#Q*5S41d#Q;(b;(c&R;(c;(d%{;(d;(e&X~#sXO#O#Q#O#P#p#P#q#Q#q#r$`#r$Ml#Q*5S41d#Q;(b;(c&R;(c;(d%{;(d;(e&X~$cTO#q#Q#q#r$r#r;'S#Q;'S;=`%{<%lO#Q~$wXR~O#O#Q#O#P#p#P#q#Q#q#r%d#r$Ml#Q*5S41d#Q;(b;(c&R;(c;(d%{;(d;(e&X~%gTO#q#Q#q#r%v#r;'S#Q;'S;=`%{<%lO#Q~%{OR~~&OP;=`<%l#Q~&UP;NQ<%l#Q~&[P;=`;My#Q",
tokenizers: [0],
topRules: { Program: [0, 1] },
tokenPrec: 0,

View file

@ -277,3 +277,19 @@ Program(Resolvable)
==>
Program(Resolvable)
# Resolvable with new emoji range
{{ '🟢' }}
==>
Program(Resolvable)
# Resolvable with new emoji range end of range
{{ '🫸' }}
==>
Program(Resolvable)

View file

@ -1,6 +1,6 @@
{
"name": "@n8n/config",
"version": "1.25.0",
"version": "1.26.0",
"scripts": {
"clean": "rimraf dist .turbo",
"dev": "pnpm watch",

View file

@ -12,7 +12,6 @@ export class TaskRunnersConfig {
@Env('N8N_RUNNERS_ENABLED')
enabled: boolean = false;
// Defaults to true for now
@Env('N8N_RUNNERS_MODE')
mode: TaskRunnerMode = 'internal';
@ -23,12 +22,12 @@ export class TaskRunnersConfig {
@Env('N8N_RUNNERS_AUTH_TOKEN')
authToken: string = '';
/** IP address task runners server should listen on */
@Env('N8N_RUNNERS_HEALTH_CHECK_SERVER_PORT')
/** Port the task runners broker should listen on */
@Env('N8N_RUNNERS_BROKER_PORT')
port: number = 5679;
/** IP address task runners server should listen on */
@Env('N8N_RUNNERS_SERVER_LISTEN_ADDRESS')
/** IP address task runners broker should listen on */
@Env('N8N_RUNNERS_BROKER_LISTEN_ADDRESS')
listenAddress: string = '127.0.0.1';
/** Maximum size of a payload sent to the runner in bytes, Default 1G */
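
As a quick, hedged illustration of the renamed settings: the `envOr` helper below is hypothetical and only mirrors what the `@Env` decorator does, namely read the named environment variable and fall back to the declared default.

```ts
// Hypothetical helper mirroring @Env resolution: use the named environment
// variable if present, otherwise the declared default.
function envOr<T>(name: string, fallback: T, parse: (raw: string) => T): T {
  const raw = process.env[name];
  return raw === undefined ? fallback : parse(raw);
}

// The renamed task-runner broker settings from this diff, with their defaults:
const brokerPort = envOr('N8N_RUNNERS_BROKER_PORT', 5679, Number);
const brokerListenAddress = envOr('N8N_RUNNERS_BROKER_LISTEN_ADDRESS', '127.0.0.1', (s) => s);

console.log({ brokerPort, brokerListenAddress });
```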

View file

@ -147,7 +147,7 @@ export class LmChatOpenAi implements INodeType {
displayName: 'Model',
name: 'model',
type: 'resourceLocator',
default: { mode: 'list', value: '' },
default: { mode: 'list', value: 'gpt-4o-mini' },
required: true,
modes: [
{
@ -164,7 +164,7 @@ export class LmChatOpenAi implements INodeType {
displayName: 'ID',
name: 'id',
type: 'string',
placeholder: '2302163813',
placeholder: 'gpt-4o-mini',
},
],
description: 'The model. Choose from the list, or specify an ID.',

View file

@ -1,6 +1,6 @@
{
"name": "@n8n/n8n-nodes-langchain",
"version": "1.75.0",
"version": "1.76.0",
"description": "",
"main": "index.js",
"scripts": {

View file

@ -1,6 +1,6 @@
{
"name": "@n8n/task-runner",
"version": "1.13.0",
"version": "1.14.0",
"scripts": {
"clean": "rimraf dist .turbo",
"start": "node dist/start.js",

View file

@ -1,6 +1,6 @@
{
"name": "n8n",
"version": "1.75.0",
"version": "1.76.0",
"description": "n8n Workflow Automation Tool",
"main": "dist/index",
"types": "dist/index.d.ts",

View file

@ -251,6 +251,20 @@ describe('License', () => {
expect(LicenseManager).toHaveBeenCalledWith(expect.objectContaining(expectedRenewalSettings));
});
it('when CLI command with N8N_LICENSE_AUTO_RENEW_ENABLED=true, should enable renewal', async () => {
const globalConfig = mock<GlobalConfig>({
license: { ...licenseConfig, autoRenewalEnabled: true },
});
await new License(mockLogger(), mock(), mock(), mock(), globalConfig).init({
isCli: true,
});
expect(LicenseManager).toHaveBeenCalledWith(
expect.objectContaining({ autoRenewEnabled: true, renewOnInit: true }),
);
});
});
describe('reinit', () => {
@ -262,7 +276,7 @@ describe('License', () => {
await license.reinit();
expect(initSpy).toHaveBeenCalledWith(true);
expect(initSpy).toHaveBeenCalledWith({ forceRecreate: true });
expect(LicenseManager.prototype.reset).toHaveBeenCalled();
expect(LicenseManager.prototype.initialize).toHaveBeenCalled();

View file

@ -7,6 +7,7 @@ import { readFile } from 'fs/promises';
import type { Server } from 'http';
import isbot from 'isbot';
import { Logger } from 'n8n-core';
import path from 'path';
import config from '@/config';
import { N8N_VERSION, TEMPLATES_DIR, inDevelopment, inTest } from '@/constants';
@ -67,6 +68,9 @@ export abstract class AbstractServer {
this.app.set('view engine', 'handlebars');
this.app.set('views', TEMPLATES_DIR);
const assetsPath: string = path.join(__dirname, '../../../assets');
this.app.use(express.static(assetsPath));
const proxyHops = config.getEnv('proxy_hops');
if (proxyHops > 0) this.app.set('trust proxy', proxyHops);
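
A minimal, self-contained sketch of the new static-assets wiring (assumes a plain Express app outside n8n's AbstractServer; the relative path is taken from the diff above):

```ts
// Standalone sketch of the change above: serve files from an assets directory
// (e.g. the n8n-logo.png referenced by the OAuth callback template) via Express.
import path from 'path';
import express from 'express';

const app = express();
const assetsPath: string = path.join(__dirname, '../../../assets');
app.use(express.static(assetsPath));
```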

View file

@ -40,6 +40,7 @@ import {
import type { WorkflowEntity } from '@/databases/entities/workflow-entity';
import { WorkflowRepository } from '@/databases/repositories/workflow.repository';
import { OnShutdown } from '@/decorators/on-shutdown';
import { executeErrorWorkflow } from '@/execution-lifecycle/execute-error-workflow';
import { ExecutionService } from '@/executions/execution.service';
import { ExternalHooks } from '@/external-hooks';
import type { IWorkflowDb } from '@/interfaces';
@ -400,7 +401,7 @@ export class ActiveWorkflowManager {
status: 'running',
};
WorkflowExecuteAdditionalData.executeErrorWorkflow(workflowData, fullRunData, mode);
executeErrorWorkflow(workflowData, fullRunData, mode);
}
/**

View file

@ -16,7 +16,7 @@ export class ClearLicenseCommand extends BaseCommand {
// Attempt to invoke shutdown() to force any floating entitlements to be released
const license = Container.get(License);
await license.init();
await license.init({ isCli: true });
try {
await license.shutdown();
} catch {

View file

@ -11,7 +11,7 @@ export class LicenseInfoCommand extends BaseCommand {
async run() {
const license = Container.get(License);
await license.init();
await license.init({ isCli: true });
this.logger.info('Printing license information:\n' + license.getInfo());
}

View file

@ -255,7 +255,7 @@ describe('OAuth2CredentialController', () => {
type: 'oAuth2Api',
}),
);
expect(res.render).toHaveBeenCalledWith('oauth-callback');
expect(res.render).toHaveBeenCalledWith('oauth-callback', { imagePath: 'n8n-logo.png' });
});
it('merges oauthTokenData if it already exists', async () => {
@ -297,7 +297,7 @@ describe('OAuth2CredentialController', () => {
type: 'oAuth2Api',
}),
);
expect(res.render).toHaveBeenCalledWith('oauth-callback');
expect(res.render).toHaveBeenCalledWith('oauth-callback', { imagePath: 'n8n-logo.png' });
});
it('overwrites oauthTokenData if it is a string', async () => {
@ -335,7 +335,7 @@ describe('OAuth2CredentialController', () => {
type: 'oAuth2Api',
}),
);
expect(res.render).toHaveBeenCalledWith('oauth-callback');
expect(res.render).toHaveBeenCalledWith('oauth-callback', { imagePath: 'n8n-logo.png' });
});
});
});

View file

@ -149,7 +149,7 @@ export class OAuth2CredentialController extends AbstractOAuthController {
credentialId: credential.id,
});
return res.render('oauth-callback');
return res.render('oauth-callback', { imagePath: 'n8n-logo.png' });
} catch (error) {
return this.renderCallbackError(
res,

View file

@ -15,7 +15,7 @@ import { SharedWorkflowRepository } from '@/databases/repositories/shared-workfl
import { WorkflowRepository } from '@/databases/repositories/workflow.repository';
import { EventService } from '@/events/event.service';
import type { RelayEventMap } from '@/events/maps/relay.event-map';
import { determineFinalExecutionStatus } from '@/execution-lifecycle-hooks/shared/shared-hook-functions';
import { determineFinalExecutionStatus } from '@/execution-lifecycle/shared/shared-hook-functions';
import type { IExecutionTrackProperties } from '@/interfaces';
import { License } from '@/license';
import { NodeTypes } from '@/node-types';

View file

@ -26,8 +26,9 @@ import { mockInstance } from '@test/mocking';
import {
getWorkflowHooksMain,
getWorkflowHooksWorkerExecuter,
getWorkflowHooksWorkerMain,
} from '../workflow-execute-additional-data';
} from '../execution-lifecycle-hooks';
describe('Execution Lifecycle Hooks', () => {
mockInstance(Logger);
@ -532,4 +533,85 @@ describe('Execution Lifecycle Hooks', () => {
});
});
});
describe('getWorkflowHooksWorkerExecuter', () => {
let hooks: WorkflowHooks;
beforeEach(() => {
hooks = getWorkflowHooksWorkerExecuter(executionMode, executionId, workflowData, {
pushRef,
retryOf,
});
});
describe('saving static data', () => {
it('should skip saving static data for manual executions', async () => {
hooks.mode = 'manual';
await hooks.executeHookFunctions('workflowExecuteAfter', [successfulRun, staticData]);
expect(workflowStaticDataService.saveStaticDataById).not.toHaveBeenCalled();
});
it('should save static data for prod executions', async () => {
hooks.mode = 'trigger';
await hooks.executeHookFunctions('workflowExecuteAfter', [successfulRun, staticData]);
expect(workflowStaticDataService.saveStaticDataById).toHaveBeenCalledWith(
workflowId,
staticData,
);
});
it('should handle static data saving errors', async () => {
hooks.mode = 'trigger';
const error = new Error('Static data save failed');
workflowStaticDataService.saveStaticDataById.mockRejectedValueOnce(error);
await hooks.executeHookFunctions('workflowExecuteAfter', [successfulRun, staticData]);
expect(errorReporter.error).toHaveBeenCalledWith(error);
});
});
describe('error workflow', () => {
it('should not execute error workflow for manual executions', async () => {
hooks.mode = 'manual';
await hooks.executeHookFunctions('workflowExecuteAfter', [failedRun, {}]);
expect(workflowExecutionService.executeErrorWorkflow).not.toHaveBeenCalled();
});
it('should execute error workflow for failed non-manual executions', async () => {
hooks.mode = 'trigger';
const errorWorkflow = 'error-workflow-id';
workflowData.settings = { errorWorkflow };
const project = mock<Project>();
ownershipService.getWorkflowProjectCached.calledWith(workflowId).mockResolvedValue(project);
await hooks.executeHookFunctions('workflowExecuteAfter', [failedRun, {}]);
expect(workflowExecutionService.executeErrorWorkflow).toHaveBeenCalledWith(
errorWorkflow,
{
workflow: {
id: workflowId,
name: workflowData.name,
},
execution: {
id: executionId,
error: expressionError,
mode: 'trigger',
retryOf,
lastNodeExecuted: undefined,
url: `http://localhost:5678/workflow/${workflowId}/executions/${executionId}`,
},
},
project,
);
});
});
});
});

View file

@ -2,7 +2,7 @@ import { BinaryDataService } from 'n8n-core';
import type { IRun } from 'n8n-workflow';
import config from '@/config';
import { restoreBinaryDataId } from '@/execution-lifecycle-hooks/restore-binary-data-id';
import { restoreBinaryDataId } from '@/execution-lifecycle/restore-binary-data-id';
import { mockInstance } from '@test/mocking';
function toIRun(item?: object) {

View file

@ -3,11 +3,12 @@ import { Logger } from 'n8n-core';
import type { IRunExecutionData, ITaskData, IWorkflowBase } from 'n8n-workflow';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { saveExecutionProgress } from '@/execution-lifecycle-hooks/save-execution-progress';
import * as fnModule from '@/execution-lifecycle-hooks/to-save-settings';
import type { IExecutionResponse } from '@/interfaces';
import { mockInstance } from '@test/mocking';
import { saveExecutionProgress } from '../save-execution-progress';
import * as fnModule from '../to-save-settings';
mockInstance(Logger);
const errorReporter = mockInstance(ErrorReporter);
const executionRepository = mockInstance(ExecutionRepository);

View file

@ -1,5 +1,6 @@
import config from '@/config';
import { toSaveSettings } from '@/execution-lifecycle-hooks/to-save-settings';
import { toSaveSettings } from '../to-save-settings';
afterEach(() => {
config.load(config.default);

View file

@ -0,0 +1,130 @@
import { GlobalConfig } from '@n8n/config';
import { Container } from '@n8n/di';
import { ErrorReporter, Logger } from 'n8n-core';
import type { IRun, IWorkflowBase, WorkflowExecuteMode } from 'n8n-workflow';
import type { IWorkflowErrorData } from '@/interfaces';
import { OwnershipService } from '@/services/ownership.service';
import { UrlService } from '@/services/url.service';
import { WorkflowExecutionService } from '@/workflows/workflow-execution.service';
/**
* Checks if there was an error and if errorWorkflow or a trigger is defined. If so it collects
* all the data and executes it
*
* @param {IWorkflowBase} workflowData The workflow which got executed
* @param {IRun} fullRunData The run which produced the error
* @param {WorkflowExecuteMode} mode The mode in which the workflow got started in
* @param {string} [executionId] The id the execution got saved as
*/
export function executeErrorWorkflow(
workflowData: IWorkflowBase,
fullRunData: IRun,
mode: WorkflowExecuteMode,
executionId?: string,
retryOf?: string,
): void {
const logger = Container.get(Logger);
// Check if there was an error and if so if an errorWorkflow or a trigger is set
let pastExecutionUrl: string | undefined;
if (executionId !== undefined) {
pastExecutionUrl = `${Container.get(UrlService).getWebhookBaseUrl()}workflow/${
workflowData.id
}/executions/${executionId}`;
}
if (fullRunData.data.resultData.error !== undefined) {
let workflowErrorData: IWorkflowErrorData;
const workflowId = workflowData.id;
if (executionId) {
// The error did happen in an execution
workflowErrorData = {
execution: {
id: executionId,
url: pastExecutionUrl,
error: fullRunData.data.resultData.error,
lastNodeExecuted: fullRunData.data.resultData.lastNodeExecuted!,
mode,
retryOf,
},
workflow: {
id: workflowId,
name: workflowData.name,
},
};
} else {
// The error did happen in a trigger
workflowErrorData = {
trigger: {
error: fullRunData.data.resultData.error,
mode,
},
workflow: {
id: workflowId,
name: workflowData.name,
},
};
}
const { errorTriggerType } = Container.get(GlobalConfig).nodes;
// Run the error workflow
// To avoid an infinite loop do not run the error workflow again if the error-workflow itself failed and it is its own error-workflow.
const { errorWorkflow } = workflowData.settings ?? {};
if (errorWorkflow && !(mode === 'error' && workflowId && errorWorkflow === workflowId)) {
logger.debug('Start external error workflow', {
executionId,
errorWorkflowId: errorWorkflow,
workflowId,
});
// If a specific error workflow is set run only that one
// First, do permission checks.
if (!workflowId) {
// Manual executions do not trigger error workflows
// So this if should never happen. It was added to
// make sure there are no possible security gaps
return;
}
Container.get(OwnershipService)
.getWorkflowProjectCached(workflowId)
.then((project) => {
void Container.get(WorkflowExecutionService).executeErrorWorkflow(
errorWorkflow,
workflowErrorData,
project,
);
})
.catch((error: Error) => {
Container.get(ErrorReporter).error(error);
logger.error(
`Could not execute ErrorWorkflow for execution ID ${executionId} because of error querying the workflow owner`,
{
executionId,
errorWorkflowId: errorWorkflow,
workflowId,
error,
workflowErrorData,
},
);
});
} else if (
mode !== 'error' &&
workflowId !== undefined &&
workflowData.nodes.some((node) => node.type === errorTriggerType)
) {
logger.debug('Start internal error workflow', { executionId, workflowId });
void Container.get(OwnershipService)
.getWorkflowProjectCached(workflowId)
.then((project) => {
void Container.get(WorkflowExecutionService).executeErrorWorkflow(
workflowId,
workflowErrorData,
project,
);
});
}
}
}
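
A hedged usage sketch of the extracted helper, mirroring the call site updated in ActiveWorkflowManager above; the surrounding values are declared here only for illustration.

```ts
import type { IRun, IWorkflowBase, WorkflowExecuteMode } from 'n8n-workflow';
import { executeErrorWorkflow } from '@/execution-lifecycle/execute-error-workflow';

declare const workflowData: IWorkflowBase; // the workflow that just ran
declare const fullRunData: IRun; // run whose resultData.error is set
declare const mode: WorkflowExecuteMode; // e.g. 'trigger'

// Trigger-style call without an execution id (as in ActiveWorkflowManager above):
executeErrorWorkflow(workflowData, fullRunData, mode);

// With an execution id and retry reference, the error workflow also receives
// a URL pointing back at the failed execution:
executeErrorWorkflow(workflowData, fullRunData, mode, 'some-execution-id', 'retry-of-id');
```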

View file

@ -0,0 +1,628 @@
import { Container } from '@n8n/di';
import { stringify } from 'flatted';
import { ErrorReporter, Logger, InstanceSettings } from 'n8n-core';
import { WorkflowHooks } from 'n8n-workflow';
import type {
IDataObject,
INode,
IRun,
IRunExecutionData,
ITaskData,
IWorkflowBase,
IWorkflowExecuteHooks,
IWorkflowHooksOptionalParameters,
WorkflowExecuteMode,
IWorkflowExecutionDataProcess,
Workflow,
} from 'n8n-workflow';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { EventService } from '@/events/event.service';
import { ExternalHooks } from '@/external-hooks';
import { Push } from '@/push';
import { WorkflowStatisticsService } from '@/services/workflow-statistics.service';
import { isWorkflowIdValid } from '@/utils';
import { WorkflowStaticDataService } from '@/workflows/workflow-static-data.service';
import { executeErrorWorkflow } from './execute-error-workflow';
import { restoreBinaryDataId } from './restore-binary-data-id';
import { saveExecutionProgress } from './save-execution-progress';
import {
determineFinalExecutionStatus,
prepareExecutionDataForDbUpdate,
updateExistingExecution,
} from './shared/shared-hook-functions';
import { toSaveSettings } from './to-save-settings';
/**
* Returns hook functions to push data to Editor-UI
*/
function hookFunctionsPush(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const pushInstance = Container.get(Push);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { pushRef, executionId } = this;
// Push data to session which started workflow before each
// node which starts rendering
if (pushRef === undefined) {
return;
}
logger.debug(`Executing hook on node "${nodeName}" (hookFunctionsPush)`, {
executionId,
pushRef,
workflowId: this.workflowData.id,
});
pushInstance.send({ type: 'nodeExecuteBefore', data: { executionId, nodeName } }, pushRef);
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string, data: ITaskData): Promise<void> {
const { pushRef, executionId } = this;
// Push data to session which started workflow after each rendered node
if (pushRef === undefined) {
return;
}
logger.debug(`Executing hook on node "${nodeName}" (hookFunctionsPush)`, {
executionId,
pushRef,
workflowId: this.workflowData.id,
});
pushInstance.send(
{ type: 'nodeExecuteAfter', data: { executionId, nodeName, data } },
pushRef,
);
},
],
workflowExecuteBefore: [
async function (this: WorkflowHooks, _workflow, data): Promise<void> {
const { pushRef, executionId } = this;
const { id: workflowId, name: workflowName } = this.workflowData;
logger.debug('Executing hook (hookFunctionsPush)', {
executionId,
pushRef,
workflowId,
});
// Push data to session which started the workflow
if (pushRef === undefined) {
return;
}
pushInstance.send(
{
type: 'executionStarted',
data: {
executionId,
mode: this.mode,
startedAt: new Date(),
retryOf: this.retryOf,
workflowId,
workflowName,
flattedRunData: data?.resultData.runData
? stringify(data.resultData.runData)
: stringify({}),
},
},
pushRef,
);
},
],
workflowExecuteAfter: [
async function (this: WorkflowHooks, fullRunData: IRun): Promise<void> {
const { pushRef, executionId } = this;
if (pushRef === undefined) return;
const { id: workflowId } = this.workflowData;
logger.debug('Executing hook (hookFunctionsPush)', {
executionId,
pushRef,
workflowId,
});
const { status } = fullRunData;
if (status === 'waiting') {
pushInstance.send({ type: 'executionWaiting', data: { executionId } }, pushRef);
} else {
const rawData = stringify(fullRunData.data);
pushInstance.send(
{ type: 'executionFinished', data: { executionId, workflowId, status, rawData } },
pushRef,
);
}
},
],
};
}
function hookFunctionsPreExecute(): IWorkflowExecuteHooks {
const externalHooks = Container.get(ExternalHooks);
return {
workflowExecuteBefore: [
async function (this: WorkflowHooks, workflow: Workflow): Promise<void> {
await externalHooks.run('workflow.preExecute', [workflow, this.mode]);
},
],
nodeExecuteAfter: [
async function (
this: WorkflowHooks,
nodeName: string,
data: ITaskData,
executionData: IRunExecutionData,
): Promise<void> {
await saveExecutionProgress(
this.workflowData,
this.executionId,
nodeName,
data,
executionData,
this.pushRef,
);
},
],
};
}
/**
* Returns hook functions to save workflow execution and call error workflow
*/
function hookFunctionsSave(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const workflowStatisticsService = Container.get(WorkflowStatisticsService);
const eventService = Container.get(EventService);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-pre-execute', { executionId, workflow, nodeName });
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-post-execute', { executionId, workflow, nodeName });
},
],
workflowExecuteBefore: [],
workflowExecuteAfter: [
async function (
this: WorkflowHooks,
fullRunData: IRun,
newStaticData: IDataObject,
): Promise<void> {
logger.debug('Executing hook (hookFunctionsSave)', {
executionId: this.executionId,
workflowId: this.workflowData.id,
});
await restoreBinaryDataId(fullRunData, this.executionId, this.mode);
const isManualMode = this.mode === 'manual';
try {
if (!isManualMode && isWorkflowIdValid(this.workflowData.id) && newStaticData) {
// Workflow is saved so update in database
try {
await Container.get(WorkflowStaticDataService).saveStaticDataById(
this.workflowData.id,
newStaticData,
);
} catch (e) {
Container.get(ErrorReporter).error(e);
logger.error(
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
`There was a problem saving the workflow with id "${this.workflowData.id}" to save changed staticData: "${e.message}" (hookFunctionsSave)`,
{ executionId: this.executionId, workflowId: this.workflowData.id },
);
}
}
const executionStatus = determineFinalExecutionStatus(fullRunData);
fullRunData.status = executionStatus;
const saveSettings = toSaveSettings(this.workflowData.settings);
if (isManualMode && !saveSettings.manual && !fullRunData.waitTill) {
/**
* When manual executions are not being saved, we only soft-delete
* the execution so that the user can access its binary data
* while building their workflow.
*
* The manual execution and its binary data will be hard-deleted
* on the next pruning cycle after the grace period set by
* `EXECUTIONS_DATA_HARD_DELETE_BUFFER`.
*/
await Container.get(ExecutionRepository).softDelete(this.executionId);
return;
}
const shouldNotSave =
(executionStatus === 'success' && !saveSettings.success) ||
(executionStatus !== 'success' && !saveSettings.error);
if (shouldNotSave && !fullRunData.waitTill && !isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
await Container.get(ExecutionRepository).hardDelete({
workflowId: this.workflowData.id,
executionId: this.executionId,
});
return;
}
// Although it is treated as IWorkflowBase here, it's being instantiated elsewhere with properties that may be sensitive
// As a result, we should create an IWorkflowBase object with only the data we want to save in it.
const fullExecutionData = prepareExecutionDataForDbUpdate({
runData: fullRunData,
workflowData: this.workflowData,
workflowStatusFinal: executionStatus,
retryOf: this.retryOf,
});
// When going into the waiting state, store the pushRef in the execution-data
if (fullRunData.waitTill && isManualMode) {
fullExecutionData.data.pushRef = this.pushRef;
}
await updateExistingExecution({
executionId: this.executionId,
workflowId: this.workflowData.id,
executionData: fullExecutionData,
});
if (!isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
} catch (error) {
Container.get(ErrorReporter).error(error);
logger.error(`Failed saving execution data to DB on execution ID ${this.executionId}`, {
executionId: this.executionId,
workflowId: this.workflowData.id,
// eslint-disable-next-line @typescript-eslint/no-unsafe-assignment
error,
});
if (!isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
} finally {
workflowStatisticsService.emit('workflowExecutionCompleted', {
workflowData: this.workflowData,
fullRunData,
});
}
},
],
nodeFetchedData: [
async (workflowId: string, node: INode) => {
workflowStatisticsService.emit('nodeFetchedData', { workflowId, node });
},
],
};
}
/**
* Returns hook functions to save workflow execution and call error workflow
* for running with queues. Manual executions should never run on queues as
* they are always executed in the main process.
*/
function hookFunctionsSaveWorker(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const workflowStatisticsService = Container.get(WorkflowStatisticsService);
const eventService = Container.get(EventService);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-pre-execute', { executionId, workflow, nodeName });
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-post-execute', { executionId, workflow, nodeName });
},
],
workflowExecuteBefore: [
async function (this: WorkflowHooks): Promise<void> {
const { executionId, workflowData } = this;
eventService.emit('workflow-pre-execute', { executionId, data: workflowData });
},
],
workflowExecuteAfter: [
async function (
this: WorkflowHooks,
fullRunData: IRun,
newStaticData: IDataObject,
): Promise<void> {
logger.debug('Executing hook (hookFunctionsSaveWorker)', {
executionId: this.executionId,
workflowId: this.workflowData.id,
});
const isManualMode = this.mode === 'manual';
try {
if (!isManualMode && isWorkflowIdValid(this.workflowData.id) && newStaticData) {
// Workflow is saved so update in database
try {
await Container.get(WorkflowStaticDataService).saveStaticDataById(
this.workflowData.id,
newStaticData,
);
} catch (e) {
Container.get(ErrorReporter).error(e);
logger.error(
// eslint-disable-next-line @typescript-eslint/no-unsafe-member-access
`There was a problem saving the workflow with id "${this.workflowData.id}" to save changed staticData: "${e.message}" (workflowExecuteAfter)`,
{ pushRef: this.pushRef, workflowId: this.workflowData.id },
);
}
}
const workflowStatusFinal = determineFinalExecutionStatus(fullRunData);
fullRunData.status = workflowStatusFinal;
if (
!isManualMode &&
workflowStatusFinal !== 'success' &&
workflowStatusFinal !== 'waiting'
) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
// Although it is treated as IWorkflowBase here, it's being instantiated elsewhere with properties that may be sensitive
// As a result, we should create an IWorkflowBase object with only the data we want to save in it.
const fullExecutionData = prepareExecutionDataForDbUpdate({
runData: fullRunData,
workflowData: this.workflowData,
workflowStatusFinal,
retryOf: this.retryOf,
});
// When going into the waiting state, store the pushRef in the execution-data
if (fullRunData.waitTill && isManualMode) {
fullExecutionData.data.pushRef = this.pushRef;
}
await updateExistingExecution({
executionId: this.executionId,
workflowId: this.workflowData.id,
executionData: fullExecutionData,
});
} catch (error) {
if (!isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
} finally {
workflowStatisticsService.emit('workflowExecutionCompleted', {
workflowData: this.workflowData,
fullRunData,
});
}
},
async function (this: WorkflowHooks, runData: IRun): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('workflow-post-execute', {
workflow,
executionId,
runData,
});
},
async function (this: WorkflowHooks, fullRunData: IRun) {
const externalHooks = Container.get(ExternalHooks);
if (externalHooks.exists('workflow.postExecute')) {
try {
await externalHooks.run('workflow.postExecute', [
fullRunData,
this.workflowData,
this.executionId,
]);
} catch (error) {
Container.get(ErrorReporter).error(error);
Container.get(Logger).error(
'There was a problem running hook "workflow.postExecute"',
// eslint-disable-next-line @typescript-eslint/no-unsafe-argument
error,
);
}
}
},
],
nodeFetchedData: [
async (workflowId: string, node: INode) => {
workflowStatisticsService.emit('nodeFetchedData', { workflowId, node });
},
],
};
}
/**
* Returns WorkflowHooks instance for running integrated workflows
* (Workflows which get started inside of another workflow)
*/
export function getWorkflowHooksIntegrated(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
): WorkflowHooks {
const hookFunctions = hookFunctionsSave();
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData);
}
/**
* Returns WorkflowHooks instance for worker in scaling mode.
*/
export function getWorkflowHooksWorkerExecuter(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
optionalParameters?: IWorkflowHooksOptionalParameters,
): WorkflowHooks {
optionalParameters = optionalParameters || {};
const hookFunctions = hookFunctionsSaveWorker();
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
if (mode === 'manual' && Container.get(InstanceSettings).isWorker) {
const pushHooks = hookFunctionsPush();
for (const key of Object.keys(pushHooks)) {
if (hookFunctions[key] === undefined) {
hookFunctions[key] = [];
}
// eslint-disable-next-line prefer-spread
hookFunctions[key].push.apply(hookFunctions[key], pushHooks[key]);
}
}
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData, optionalParameters);
}
/**
* Returns WorkflowHooks instance for main process if workflow runs via worker
*/
export function getWorkflowHooksWorkerMain(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
optionalParameters?: IWorkflowHooksOptionalParameters,
): WorkflowHooks {
optionalParameters = optionalParameters || {};
const hookFunctions = hookFunctionsPreExecute();
// TODO: why are workers pushing to frontend?
// TODO: simplifying this for now to just leave the bare minimum hooks
// const hookFunctions = hookFunctionsPush();
// const preExecuteFunctions = hookFunctionsPreExecute();
// for (const key of Object.keys(preExecuteFunctions)) {
// if (hookFunctions[key] === undefined) {
// hookFunctions[key] = [];
// }
// hookFunctions[key]!.push.apply(hookFunctions[key], preExecuteFunctions[key]);
// }
// When running with worker mode, main process executes
// Only workflowExecuteBefore + workflowExecuteAfter
// So to avoid confusion, we are removing other hooks.
hookFunctions.nodeExecuteBefore = [];
hookFunctions.nodeExecuteAfter = [];
hookFunctions.workflowExecuteAfter = [
async function (this: WorkflowHooks, fullRunData: IRun): Promise<void> {
// Don't delete executions before they are finished
if (!fullRunData.finished) return;
const executionStatus = determineFinalExecutionStatus(fullRunData);
fullRunData.status = executionStatus;
const saveSettings = toSaveSettings(this.workflowData.settings);
const isManualMode = this.mode === 'manual';
if (isManualMode && !saveSettings.manual && !fullRunData.waitTill) {
/**
* When manual executions are not being saved, we only soft-delete
* the execution so that the user can access its binary data
* while building their workflow.
*
* The manual execution and its binary data will be hard-deleted
* on the next pruning cycle after the grace period set by
* `EXECUTIONS_DATA_HARD_DELETE_BUFFER`.
*/
await Container.get(ExecutionRepository).softDelete(this.executionId);
return;
}
const shouldNotSave =
(executionStatus === 'success' && !saveSettings.success) ||
(executionStatus !== 'success' && !saveSettings.error);
if (!isManualMode && shouldNotSave && !fullRunData.waitTill) {
await Container.get(ExecutionRepository).hardDelete({
workflowId: this.workflowData.id,
executionId: this.executionId,
});
}
},
];
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData, optionalParameters);
}
/**
* Returns WorkflowHooks instance for running the main workflow
*/
export function getWorkflowHooksMain(
data: IWorkflowExecutionDataProcess,
executionId: string,
): WorkflowHooks {
const hookFunctions = hookFunctionsSave();
const pushFunctions = hookFunctionsPush();
for (const key of Object.keys(pushFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], pushFunctions[key]);
}
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
if (!hookFunctions.nodeExecuteBefore) hookFunctions.nodeExecuteBefore = [];
if (!hookFunctions.nodeExecuteAfter) hookFunctions.nodeExecuteAfter = [];
return new WorkflowHooks(hookFunctions, data.executionMode, executionId, data.workflowData, {
pushRef: data.pushRef,
retryOf: data.retryOf as string,
});
}
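
A brief, hedged sketch of how the main-process factory is consumed (the `data` object is assumed to be built by the caller; the worker path in JobProcessor further down uses `getWorkflowHooksWorkerExecuter` the same way):

```ts
import type { IWorkflowExecutionDataProcess } from 'n8n-workflow';
import { getWorkflowHooksMain } from '@/execution-lifecycle/execution-lifecycle-hooks';

declare const data: IWorkflowExecutionDataProcess; // assumed: built by the caller
declare const executionId: string;

// Combines the save, push and pre-execute hook sets defined above; the result
// is typically attached to additionalData.hooks before the run starts.
const hooks = getWorkflowHooksMain(data, executionId);
```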

View file

@ -3,7 +3,8 @@ import { ErrorReporter, Logger } from 'n8n-core';
import type { IRunExecutionData, ITaskData, IWorkflowBase } from 'n8n-workflow';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { toSaveSettings } from '@/execution-lifecycle-hooks/to-save-settings';
import { toSaveSettings } from './to-save-settings';
export async function saveExecutionProgress(
workflowData: IWorkflowBase,

View file

@ -3,6 +3,7 @@ import { stringify } from 'flatted';
import { mock } from 'jest-mock-extended';
import { InstanceSettings } from 'n8n-core';
import { randomInt } from 'n8n-workflow';
import assert from 'node:assert';
import { ARTIFICIAL_TASK_DATA } from '@/constants';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
@ -127,12 +128,15 @@ describe('ExecutionRecoveryService', () => {
});
describe('if leader, with 1+ messages', () => {
test('should return `null` if execution succeeded', async () => {
test('for successful dataful execution, should return `null`', async () => {
/**
* Arrange
*/
const workflow = await createWorkflow();
const execution = await createExecution({ status: 'success' }, workflow);
const execution = await createExecution(
{ status: 'success', data: stringify({ runData: { foo: 'bar' } }) },
workflow,
);
const messages = setupMessages(execution.id, 'Some workflow');
/**
@ -170,7 +174,38 @@ describe('ExecutionRecoveryService', () => {
expect(amendedExecution).toBeNull();
});
test('should update `status`, `stoppedAt` and `data` if last node did not finish', async () => {
test('for successful dataless execution, should update `status`, `stoppedAt` and `data`', async () => {
/**
* Arrange
*/
const workflow = await createWorkflow();
const execution = await createExecution(
{
status: 'success',
data: stringify(undefined), // saved execution but likely crashed while saving high-volume data
},
workflow,
);
const messages = setupMessages(execution.id, 'Some workflow');
/**
* Act
*/
const amendedExecution = await executionRecoveryService.recoverFromLogs(
execution.id,
messages,
);
/**
* Assert
*/
assert(amendedExecution);
expect(amendedExecution.stoppedAt).not.toBe(execution.stoppedAt);
expect(amendedExecution.data).toEqual({ resultData: { runData: {} } });
expect(amendedExecution.status).toBe('crashed');
});
test('for running execution, should update `status`, `stoppedAt` and `data` if last node did not finish', async () => {
/**
* Arrange
*/

View file

@ -9,9 +9,9 @@ import { ExecutionRepository } from '@/databases/repositories/execution.reposito
import { NodeCrashedError } from '@/errors/node-crashed.error';
import { WorkflowCrashedError } from '@/errors/workflow-crashed.error';
import { EventService } from '@/events/event.service';
import { getWorkflowHooksMain } from '@/execution-lifecycle/execution-lifecycle-hooks';
import type { IExecutionResponse } from '@/interfaces';
import { Push } from '@/push';
import { getWorkflowHooksMain } from '@/workflow-execute-additional-data'; // @TODO: Dependency cycle
import type { EventMessageTypes } from '../eventbus/event-message-classes';
@ -73,7 +73,7 @@ export class ExecutionRecoveryService {
unflattenData: true,
});
if (!execution || execution.status === 'success') return null;
if (!execution || (execution.status === 'success' && execution.data)) return null;
const runExecutionData = execution.data ?? { resultData: { runData: {} } };
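
Restated as a tiny, hedged sketch (the function name is illustrative): recovery is now skipped only when a 'success' execution also has saved run data; a data-less 'success' row is amended from the event log as crashed.

```ts
// Illustrative only — mirrors the guard in recoverFromLogs above.
function shouldSkipRecovery(execution?: { status: string; data?: unknown }): boolean {
  return !execution || (execution.status === 'success' && Boolean(execution.data));
}
```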

View file

@ -43,7 +43,10 @@ export class License {
this.logger = this.logger.scoped('license');
}
async init(forceRecreate = false) {
async init({
forceRecreate = false,
isCli = false,
}: { forceRecreate?: boolean; isCli?: boolean } = {}) {
if (this.manager && !forceRecreate) {
this.logger.warn('License manager already initialized or shutting down');
return;
@ -73,10 +76,13 @@ export class License {
const { isLeader } = this.instanceSettings;
const { autoRenewalEnabled } = this.globalConfig.license;
const eligibleToRenew = isCli || isLeader;
const shouldRenew = isLeader && autoRenewalEnabled;
const shouldRenew = eligibleToRenew && autoRenewalEnabled;
if (isLeader && !autoRenewalEnabled) this.logger.warn(LICENSE_RENEWAL_DISABLED_WARNING);
if (eligibleToRenew && !autoRenewalEnabled) {
this.logger.warn(LICENSE_RENEWAL_DISABLED_WARNING);
}
try {
this.manager = new LicenseManager({
@ -392,7 +398,7 @@ export class License {
async reinit() {
this.manager?.reset();
await this.init(true);
await this.init({ forceRecreate: true });
this.logger.debug('License reinitialized');
}
}
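
A hedged sketch of the new call pattern, matching the CLI command changes elsewhere in this commit:

```ts
import { Container } from '@n8n/di';
import { License } from '@/license';

async function demo() {
  const license = Container.get(License);

  // CLI commands mark themselves as eligible to renew even when not the leader:
  await license.init({ isCli: true });

  // reinit() now recreates the manager via an options object instead of a boolean flag:
  await license.init({ forceRecreate: true });
}
```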

View file

@ -13,6 +13,7 @@ import type PCancelable from 'p-cancelable';
import config from '@/config';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { WorkflowRepository } from '@/databases/repositories/workflow.repository';
import { getWorkflowHooksWorkerExecuter } from '@/execution-lifecycle/execution-lifecycle-hooks';
import { ManualExecutionService } from '@/manual-execution.service';
import { NodeTypes } from '@/node-types';
import * as WorkflowExecuteAdditionalData from '@/workflow-execute-additional-data';
@ -124,7 +125,7 @@ export class JobProcessor {
const { pushRef } = job.data;
additionalData.hooks = WorkflowExecuteAdditionalData.getWorkflowHooksWorkerExecuter(
additionalData.hooks = getWorkflowHooksWorkerExecuter(
execution.mode,
job.data.executionId,
execution.workflowData,

View file

@ -54,6 +54,7 @@ export class TaskRunnerProcess extends TypedEmitter<TaskRunnerProcessEventMap> {
private readonly passthroughEnvVars = [
'PATH',
'HOME', // So home directory can be resolved correctly
'GENERIC_TIMEZONE',
'NODE_FUNCTION_ALLOW_BUILTIN',
'NODE_FUNCTION_ALLOW_EXTERNAL',

View file

@ -100,7 +100,7 @@ export class TaskRunnerServer {
this.server.on('error', (error: Error & { code: string }) => {
if (error.code === 'EADDRINUSE') {
this.logger.info(
`n8n Task Runner's port ${port} is already in use. Do you have another instance of n8n running already?`,
`n8n Task Broker's port ${port} is already in use. Do you have another instance of n8n running already?`,
);
process.exit(1);
}
@ -111,7 +111,7 @@ export class TaskRunnerServer {
this.server.listen(port, address, () => resolve());
});
this.logger.info(`n8n Task Runner server ready on ${address}, port ${port}`);
this.logger.info(`n8n Task Broker ready on ${address}, port ${port}`);
}
/** Creates WebSocket server for handling upgrade requests */

View file

@ -2,7 +2,7 @@ import { mock } from 'jest-mock-extended';
import type { INode } from 'n8n-workflow';
import { NodeOperationError, type Workflow } from 'n8n-workflow';
import { objectToError } from '../workflow-execute-additional-data';
import { objectToError } from '../object-to-error';
describe('objectToError', () => {
describe('node error handling', () => {

View file

@ -0,0 +1,53 @@
import { isObjectLiteral } from 'n8n-core';
import { NodeOperationError } from 'n8n-workflow';
import type { Workflow } from 'n8n-workflow';
export function objectToError(errorObject: unknown, workflow: Workflow): Error {
// TODO: Expand with other error types
if (errorObject instanceof Error) {
// If it's already an Error instance, return it as is.
return errorObject;
} else if (
isObjectLiteral(errorObject) &&
'message' in errorObject &&
typeof errorObject.message === 'string'
) {
// If it's an object with a 'message' property, create a new Error instance.
let error: Error | undefined;
if (
'node' in errorObject &&
isObjectLiteral(errorObject.node) &&
typeof errorObject.node.name === 'string'
) {
const node = workflow.getNode(errorObject.node.name);
if (node) {
error = new NodeOperationError(
node,
errorObject as unknown as Error,
errorObject as object,
);
}
}
if (error === undefined) {
error = new Error(errorObject.message);
}
if ('description' in errorObject) {
// @ts-expect-error Error descriptions are surfaced by the UI but
// not all backend errors account for this property yet.
error.description = errorObject.description as string;
}
if ('stack' in errorObject) {
// If there's a 'stack' property, set it on the new Error instance.
error.stack = errorObject.stack as string;
}
return error;
} else {
// If it's neither an Error nor an object with a 'message' property, create a generic Error.
return new Error('An error occurred');
}
}
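
A hedged usage example of the extracted helper; the workflow instance and the shape of the error-like object are assumptions for illustration.

```ts
import type { Workflow } from 'n8n-workflow';
import { objectToError } from '@/utils/object-to-error';

declare const workflow: Workflow; // assumed: the workflow being executed

// An error-like object coming back across a serialization boundary:
const error = objectToError(
  { message: 'Request failed', description: 'Upstream returned 502', node: { name: 'HTTP Request' } },
  workflow,
);
// If the named node exists on the workflow, this is a NodeOperationError;
// otherwise a plain Error carrying the message, description and stack.
```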

View file

@ -5,15 +5,8 @@
import type { PushMessage, PushType } from '@n8n/api-types';
import { GlobalConfig } from '@n8n/config';
import { Container } from '@n8n/di';
import { stringify } from 'flatted';
import {
ErrorReporter,
Logger,
InstanceSettings,
WorkflowExecute,
isObjectLiteral,
} from 'n8n-core';
import { ApplicationError, NodeOperationError, Workflow, WorkflowHooks } from 'n8n-workflow';
import { Logger, WorkflowExecute } from 'n8n-core';
import { ApplicationError, Workflow } from 'n8n-workflow';
import type {
IDataObject,
IExecuteData,
@ -23,11 +16,8 @@ import type {
INodeParameters,
IRun,
IRunExecutionData,
ITaskData,
IWorkflowBase,
IWorkflowExecuteAdditionalData,
IWorkflowExecuteHooks,
IWorkflowHooksOptionalParameters,
IWorkflowSettings,
WorkflowExecuteMode,
ExecutionStatus,
@ -44,633 +34,23 @@ import type {
import { ActiveExecutions } from '@/active-executions';
import { CredentialsHelper } from '@/credentials-helper';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { WorkflowRepository } from '@/databases/repositories/workflow.repository';
import { EventService } from '@/events/event.service';
import type { AiEventMap, AiEventPayload } from '@/events/maps/ai.event-map';
import { getWorkflowHooksIntegrated } from '@/execution-lifecycle/execution-lifecycle-hooks';
import { ExternalHooks } from '@/external-hooks';
import type { IWorkflowErrorData, UpdateExecutionPayload } from '@/interfaces';
import type { UpdateExecutionPayload } from '@/interfaces';
import { NodeTypes } from '@/node-types';
import { Push } from '@/push';
import { WorkflowStatisticsService } from '@/services/workflow-statistics.service';
import { findSubworkflowStart, isWorkflowIdValid } from '@/utils';
import { SecretsHelper } from '@/secrets-helpers.ee';
import { UrlService } from '@/services/url.service';
import { SubworkflowPolicyChecker } from '@/subworkflows/subworkflow-policy-checker.service';
import { TaskRequester } from '@/task-runners/task-managers/task-requester';
import { PermissionChecker } from '@/user-management/permission-checker';
import { findSubworkflowStart } from '@/utils';
import { objectToError } from '@/utils/object-to-error';
import * as WorkflowHelpers from '@/workflow-helpers';
import { WorkflowRepository } from './databases/repositories/workflow.repository';
import { EventService } from './events/event.service';
import { restoreBinaryDataId } from './execution-lifecycle-hooks/restore-binary-data-id';
import { saveExecutionProgress } from './execution-lifecycle-hooks/save-execution-progress';
import {
determineFinalExecutionStatus,
prepareExecutionDataForDbUpdate,
updateExistingExecution,
} from './execution-lifecycle-hooks/shared/shared-hook-functions';
import { toSaveSettings } from './execution-lifecycle-hooks/to-save-settings';
import { SecretsHelper } from './secrets-helpers.ee';
import { OwnershipService } from './services/ownership.service';
import { UrlService } from './services/url.service';
import { SubworkflowPolicyChecker } from './subworkflows/subworkflow-policy-checker.service';
import { TaskRequester } from './task-runners/task-managers/task-requester';
import { PermissionChecker } from './user-management/permission-checker';
import { WorkflowExecutionService } from './workflows/workflow-execution.service';
import { WorkflowStaticDataService } from './workflows/workflow-static-data.service';
export function objectToError(errorObject: unknown, workflow: Workflow): Error {
// TODO: Expand with other error types
if (errorObject instanceof Error) {
// If it's already an Error instance, return it as is.
return errorObject;
} else if (
isObjectLiteral(errorObject) &&
'message' in errorObject &&
typeof errorObject.message === 'string'
) {
// If it's an object with a 'message' property, create a new Error instance.
let error: Error | undefined;
if (
'node' in errorObject &&
isObjectLiteral(errorObject.node) &&
typeof errorObject.node.name === 'string'
) {
const node = workflow.getNode(errorObject.node.name);
if (node) {
error = new NodeOperationError(
node,
errorObject as unknown as Error,
errorObject as object,
);
}
}
if (error === undefined) {
error = new Error(errorObject.message);
}
if ('description' in errorObject) {
// @ts-expect-error Error descriptions are surfaced by the UI but
// not all backend errors account for this property yet.
error.description = errorObject.description as string;
}
if ('stack' in errorObject) {
// If there's a 'stack' property, set it on the new Error instance.
error.stack = errorObject.stack as string;
}
return error;
} else {
// If it's neither an Error nor an object with a 'message' property, create a generic Error.
return new Error('An error occurred');
}
}
/**
* Checks if there was an error and if errorWorkflow or a trigger is defined. If so it collects
* all the data and executes it
*
* @param {IWorkflowBase} workflowData The workflow which got executed
* @param {IRun} fullRunData The run which produced the error
* @param {WorkflowExecuteMode} mode The mode in which the workflow got started in
* @param {string} [executionId] The id the execution got saved as
*/
export function executeErrorWorkflow(
workflowData: IWorkflowBase,
fullRunData: IRun,
mode: WorkflowExecuteMode,
executionId?: string,
retryOf?: string,
): void {
const logger = Container.get(Logger);
// Check if there was an error and if so if an errorWorkflow or a trigger is set
let pastExecutionUrl: string | undefined;
if (executionId !== undefined) {
pastExecutionUrl = `${Container.get(UrlService).getWebhookBaseUrl()}workflow/${
workflowData.id
}/executions/${executionId}`;
}
if (fullRunData.data.resultData.error !== undefined) {
let workflowErrorData: IWorkflowErrorData;
const workflowId = workflowData.id;
if (executionId) {
// The error did happen in an execution
workflowErrorData = {
execution: {
id: executionId,
url: pastExecutionUrl,
error: fullRunData.data.resultData.error,
lastNodeExecuted: fullRunData.data.resultData.lastNodeExecuted!,
mode,
retryOf,
},
workflow: {
id: workflowId,
name: workflowData.name,
},
};
} else {
// The error did happen in a trigger
workflowErrorData = {
trigger: {
error: fullRunData.data.resultData.error,
mode,
},
workflow: {
id: workflowId,
name: workflowData.name,
},
};
}
const { errorTriggerType } = Container.get(GlobalConfig).nodes;
// Run the error workflow
// To avoid an infinite loop do not run the error workflow again if the error-workflow itself failed and it is its own error-workflow.
const { errorWorkflow } = workflowData.settings ?? {};
if (errorWorkflow && !(mode === 'error' && workflowId && errorWorkflow === workflowId)) {
logger.debug('Start external error workflow', {
executionId,
errorWorkflowId: errorWorkflow,
workflowId,
});
// If a specific error workflow is set run only that one
// First, do permission checks.
if (!workflowId) {
				// Manual executions do not trigger error workflows,
				// so this branch should never be reached. It exists only to
				// make sure there are no possible security gaps.
return;
}
Container.get(OwnershipService)
.getWorkflowProjectCached(workflowId)
.then((project) => {
void Container.get(WorkflowExecutionService).executeErrorWorkflow(
errorWorkflow,
workflowErrorData,
project,
);
})
.catch((error: Error) => {
Container.get(ErrorReporter).error(error);
logger.error(
						`Could not execute ErrorWorkflow for execution ID ${executionId} because of error querying the workflow owner`,
{
executionId,
errorWorkflowId: errorWorkflow,
workflowId,
error,
workflowErrorData,
},
);
});
} else if (
mode !== 'error' &&
workflowId !== undefined &&
workflowData.nodes.some((node) => node.type === errorTriggerType)
) {
logger.debug('Start internal error workflow', { executionId, workflowId });
void Container.get(OwnershipService)
.getWorkflowProjectCached(workflowId)
.then((project) => {
void Container.get(WorkflowExecutionService).executeErrorWorkflow(
workflowId,
workflowErrorData,
project,
);
});
}
}
}
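// The errorWorkflow branch above guards against a workflow that is configured as its own
// error workflow re-triggering itself forever. A condensed, standalone version of that check;
// the function name and parameters are illustrative.
function shouldRunExternalErrorWorkflow(
	mode: string,
	workflowId: string | undefined,
	errorWorkflowId: string | undefined,
): boolean {
	if (!errorWorkflowId) return false;
	// Skip when the failed run was itself the error workflow of the same workflow.
	return !(mode === 'error' && workflowId !== undefined && errorWorkflowId === workflowId);
}
console.log(shouldRunExternalErrorWorkflow('error', 'wf-1', 'wf-1')); // false: would loop
console.log(shouldRunExternalErrorWorkflow('trigger', 'wf-1', 'wf-2')); // true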
/**
* Returns hook functions to push data to Editor-UI
*
*/
function hookFunctionsPush(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const pushInstance = Container.get(Push);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { pushRef, executionId } = this;
// Push data to session which started workflow before each
// node which starts rendering
if (pushRef === undefined) {
return;
}
logger.debug(`Executing hook on node "${nodeName}" (hookFunctionsPush)`, {
executionId,
pushRef,
workflowId: this.workflowData.id,
});
pushInstance.send({ type: 'nodeExecuteBefore', data: { executionId, nodeName } }, pushRef);
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string, data: ITaskData): Promise<void> {
const { pushRef, executionId } = this;
// Push data to session which started workflow after each rendered node
if (pushRef === undefined) {
return;
}
logger.debug(`Executing hook on node "${nodeName}" (hookFunctionsPush)`, {
executionId,
pushRef,
workflowId: this.workflowData.id,
});
pushInstance.send(
{ type: 'nodeExecuteAfter', data: { executionId, nodeName, data } },
pushRef,
);
},
],
workflowExecuteBefore: [
async function (this: WorkflowHooks, _workflow, data): Promise<void> {
const { pushRef, executionId } = this;
const { id: workflowId, name: workflowName } = this.workflowData;
logger.debug('Executing hook (hookFunctionsPush)', {
executionId,
pushRef,
workflowId,
});
// Push data to session which started the workflow
if (pushRef === undefined) {
return;
}
pushInstance.send(
{
type: 'executionStarted',
data: {
executionId,
mode: this.mode,
startedAt: new Date(),
retryOf: this.retryOf,
workflowId,
workflowName,
flattedRunData: data?.resultData.runData
? stringify(data.resultData.runData)
: stringify({}),
},
},
pushRef,
);
},
],
workflowExecuteAfter: [
async function (this: WorkflowHooks, fullRunData: IRun): Promise<void> {
const { pushRef, executionId } = this;
if (pushRef === undefined) return;
const { id: workflowId } = this.workflowData;
logger.debug('Executing hook (hookFunctionsPush)', {
executionId,
pushRef,
workflowId,
});
const { status } = fullRunData;
if (status === 'waiting') {
pushInstance.send({ type: 'executionWaiting', data: { executionId } }, pushRef);
} else {
const rawData = stringify(fullRunData.data);
pushInstance.send(
{ type: 'executionFinished', data: { executionId, workflowId, status, rawData } },
pushRef,
);
}
},
],
};
}
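// The push hooks above serialize run data with a circular-reference-safe stringify before
// sending it to the browser. A standalone illustration of why plain JSON.stringify is not
// enough here; it assumes the stringify import in this file comes from the 'flatted' package,
// which may differ from the actual import.
import { parse, stringify as flattedStringify } from 'flatted';

const circularRunData: Record<string, unknown> = {};
circularRunData.self = circularRunData; // run data can contain circular references
// JSON.stringify(circularRunData) would throw "Converting circular structure to JSON"
const wirePayload = flattedStringify(circularRunData);
console.log(parse(wirePayload).self !== undefined); // true: the reference survives the round trip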
export function hookFunctionsPreExecute(): IWorkflowExecuteHooks {
const externalHooks = Container.get(ExternalHooks);
return {
workflowExecuteBefore: [
async function (this: WorkflowHooks, workflow: Workflow): Promise<void> {
await externalHooks.run('workflow.preExecute', [workflow, this.mode]);
},
],
nodeExecuteAfter: [
async function (
this: WorkflowHooks,
nodeName: string,
data: ITaskData,
executionData: IRunExecutionData,
): Promise<void> {
await saveExecutionProgress(
this.workflowData,
this.executionId,
nodeName,
data,
executionData,
this.pushRef,
);
},
],
};
}
/**
* Returns hook functions to save workflow execution and call error workflow
*
*/
function hookFunctionsSave(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const workflowStatisticsService = Container.get(WorkflowStatisticsService);
const eventService = Container.get(EventService);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-pre-execute', { executionId, workflow, nodeName });
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-post-execute', { executionId, workflow, nodeName });
},
],
workflowExecuteBefore: [],
workflowExecuteAfter: [
async function (
this: WorkflowHooks,
fullRunData: IRun,
newStaticData: IDataObject,
): Promise<void> {
logger.debug('Executing hook (hookFunctionsSave)', {
executionId: this.executionId,
workflowId: this.workflowData.id,
});
await restoreBinaryDataId(fullRunData, this.executionId, this.mode);
const isManualMode = this.mode === 'manual';
try {
if (!isManualMode && isWorkflowIdValid(this.workflowData.id) && newStaticData) {
// Workflow is saved so update in database
try {
await Container.get(WorkflowStaticDataService).saveStaticDataById(
this.workflowData.id,
newStaticData,
);
} catch (e) {
Container.get(ErrorReporter).error(e);
logger.error(
`There was a problem saving the workflow with id "${this.workflowData.id}" to save changed staticData: "${e.message}" (hookFunctionsSave)`,
{ executionId: this.executionId, workflowId: this.workflowData.id },
);
}
}
const executionStatus = determineFinalExecutionStatus(fullRunData);
fullRunData.status = executionStatus;
const saveSettings = toSaveSettings(this.workflowData.settings);
if (isManualMode && !saveSettings.manual && !fullRunData.waitTill) {
/**
* When manual executions are not being saved, we only soft-delete
* the execution so that the user can access its binary data
* while building their workflow.
*
* The manual execution and its binary data will be hard-deleted
* on the next pruning cycle after the grace period set by
* `EXECUTIONS_DATA_HARD_DELETE_BUFFER`.
*/
await Container.get(ExecutionRepository).softDelete(this.executionId);
return;
}
const shouldNotSave =
(executionStatus === 'success' && !saveSettings.success) ||
(executionStatus !== 'success' && !saveSettings.error);
if (shouldNotSave && !fullRunData.waitTill && !isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
await Container.get(ExecutionRepository).hardDelete({
workflowId: this.workflowData.id,
executionId: this.executionId,
});
return;
}
// Although it is treated as IWorkflowBase here, it's being instantiated elsewhere with properties that may be sensitive
// As a result, we should create an IWorkflowBase object with only the data we want to save in it.
const fullExecutionData = prepareExecutionDataForDbUpdate({
runData: fullRunData,
workflowData: this.workflowData,
workflowStatusFinal: executionStatus,
retryOf: this.retryOf,
});
// When going into the waiting state, store the pushRef in the execution-data
if (fullRunData.waitTill && isManualMode) {
fullExecutionData.data.pushRef = this.pushRef;
}
await updateExistingExecution({
executionId: this.executionId,
workflowId: this.workflowData.id,
executionData: fullExecutionData,
});
if (!isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
} catch (error) {
Container.get(ErrorReporter).error(error);
logger.error(`Failed saving execution data to DB on execution ID ${this.executionId}`, {
executionId: this.executionId,
workflowId: this.workflowData.id,
error,
});
if (!isManualMode) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
} finally {
workflowStatisticsService.emit('workflowExecutionCompleted', {
workflowData: this.workflowData,
fullRunData,
});
}
},
],
nodeFetchedData: [
async (workflowId: string, node: INode) => {
workflowStatisticsService.emit('nodeFetchedData', { workflowId, node });
},
],
};
}
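// The save logic above reduces to a small decision: soft-delete unsaved manual runs,
// hard-delete runs whose final status is excluded by the workflow's save settings, and
// persist everything else. A condensed sketch of that decision; the SaveSettingsSketch shape
// and the function name are illustrative, not the real toSaveSettings return type.
type SaveSettingsSketch = { manual: boolean; success: boolean; error: boolean };

function decidePersistence(
	status: 'success' | 'error',
	isManualMode: boolean,
	isWaiting: boolean,
	saveSettings: SaveSettingsSketch,
): 'soft-delete' | 'hard-delete' | 'save' {
	if (isManualMode && !saveSettings.manual && !isWaiting) return 'soft-delete';
	const shouldNotSave =
		(status === 'success' && !saveSettings.success) ||
		(status !== 'success' && !saveSettings.error);
	if (shouldNotSave && !isWaiting && !isManualMode) return 'hard-delete';
	return 'save';
}

console.log(decidePersistence('success', true, false, { manual: false, success: true, error: true })); // 'soft-delete'
console.log(decidePersistence('error', false, false, { manual: true, success: true, error: false })); // 'hard-delete'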
/**
* Returns hook functions to save workflow execution and call error workflow
* for running with queues. Manual executions should never run on queues as
* they are always executed in the main process.
*
*/
function hookFunctionsSaveWorker(): IWorkflowExecuteHooks {
const logger = Container.get(Logger);
const workflowStatisticsService = Container.get(WorkflowStatisticsService);
const eventService = Container.get(EventService);
return {
nodeExecuteBefore: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-pre-execute', { executionId, workflow, nodeName });
},
],
nodeExecuteAfter: [
async function (this: WorkflowHooks, nodeName: string): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('node-post-execute', { executionId, workflow, nodeName });
},
],
workflowExecuteBefore: [
			async function (this: WorkflowHooks): Promise<void> {
const { executionId, workflowData } = this;
eventService.emit('workflow-pre-execute', { executionId, data: workflowData });
},
],
workflowExecuteAfter: [
async function (
this: WorkflowHooks,
fullRunData: IRun,
newStaticData: IDataObject,
): Promise<void> {
logger.debug('Executing hook (hookFunctionsSaveWorker)', {
executionId: this.executionId,
workflowId: this.workflowData.id,
});
try {
if (isWorkflowIdValid(this.workflowData.id) && newStaticData) {
// Workflow is saved so update in database
try {
await Container.get(WorkflowStaticDataService).saveStaticDataById(
this.workflowData.id,
newStaticData,
);
} catch (e) {
Container.get(ErrorReporter).error(e);
logger.error(
`There was a problem saving the workflow with id "${this.workflowData.id}" to save changed staticData: "${e.message}" (workflowExecuteAfter)`,
{ pushRef: this.pushRef, workflowId: this.workflowData.id },
);
}
}
const workflowStatusFinal = determineFinalExecutionStatus(fullRunData);
fullRunData.status = workflowStatusFinal;
if (workflowStatusFinal !== 'success' && workflowStatusFinal !== 'waiting') {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
}
// Although it is treated as IWorkflowBase here, it's being instantiated elsewhere with properties that may be sensitive
// As a result, we should create an IWorkflowBase object with only the data we want to save in it.
const fullExecutionData = prepareExecutionDataForDbUpdate({
runData: fullRunData,
workflowData: this.workflowData,
workflowStatusFinal,
retryOf: this.retryOf,
});
await updateExistingExecution({
executionId: this.executionId,
workflowId: this.workflowData.id,
executionData: fullExecutionData,
});
} catch (error) {
executeErrorWorkflow(
this.workflowData,
fullRunData,
this.mode,
this.executionId,
this.retryOf,
);
} finally {
workflowStatisticsService.emit('workflowExecutionCompleted', {
workflowData: this.workflowData,
fullRunData,
});
}
},
async function (this: WorkflowHooks, runData: IRun): Promise<void> {
const { executionId, workflowData: workflow } = this;
eventService.emit('workflow-post-execute', {
workflow,
executionId,
runData,
});
},
async function (this: WorkflowHooks, fullRunData: IRun) {
const externalHooks = Container.get(ExternalHooks);
if (externalHooks.exists('workflow.postExecute')) {
try {
await externalHooks.run('workflow.postExecute', [
fullRunData,
this.workflowData,
this.executionId,
]);
} catch (error) {
Container.get(ErrorReporter).error(error);
Container.get(Logger).error(
'There was a problem running hook "workflow.postExecute"',
error,
);
}
}
},
],
nodeFetchedData: [
async (workflowId: string, node: INode) => {
workflowStatisticsService.emit('nodeFetchedData', { workflowId, node });
},
],
};
}
export async function getRunData(
workflowData: IWorkflowBase,
inputData?: INodeExecutionData[],
@ -1061,154 +441,3 @@ export async function getBase(
eventService.emit(eventName, payload),
};
}
/**
* Returns WorkflowHooks instance for running integrated workflows
* (Workflows which get started inside of another workflow)
*/
function getWorkflowHooksIntegrated(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
): WorkflowHooks {
const hookFunctions = hookFunctionsSave();
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData);
}
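// getWorkflowHooksIntegrated and the factories below all combine hook maps the same way:
// for every hook name, append the extra handlers onto the base array. A standalone sketch of
// that merge; the HookMapSketch type is simplified for illustration.
type HookMapSketch = Record<string, Array<(...args: unknown[]) => void | Promise<void>>>;

function mergeHookMaps(base: HookMapSketch, extra: HookMapSketch): HookMapSketch {
	for (const key of Object.keys(extra)) {
		base[key] = [...(base[key] ?? []), ...extra[key]];
	}
	return base;
}

// Example: add a pre-execute logger on top of a save-only hook map
const mergedHooks = mergeHookMaps(
	{ workflowExecuteAfter: [async () => {}] },
	{ workflowExecuteBefore: [() => console.log('about to run')] },
);
console.log(Object.keys(mergedHooks)); // ['workflowExecuteAfter', 'workflowExecuteBefore']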
/**
* Returns WorkflowHooks instance for worker in scaling mode.
*/
export function getWorkflowHooksWorkerExecuter(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
optionalParameters?: IWorkflowHooksOptionalParameters,
): WorkflowHooks {
optionalParameters = optionalParameters || {};
const hookFunctions = hookFunctionsSaveWorker();
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
if (mode === 'manual' && Container.get(InstanceSettings).isWorker) {
const pushHooks = hookFunctionsPush();
for (const key of Object.keys(pushHooks)) {
if (hookFunctions[key] === undefined) {
hookFunctions[key] = [];
}
// eslint-disable-next-line prefer-spread
hookFunctions[key].push.apply(hookFunctions[key], pushHooks[key]);
}
}
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData, optionalParameters);
}
/**
* Returns WorkflowHooks instance for main process if workflow runs via worker
*/
export function getWorkflowHooksWorkerMain(
mode: WorkflowExecuteMode,
executionId: string,
workflowData: IWorkflowBase,
optionalParameters?: IWorkflowHooksOptionalParameters,
): WorkflowHooks {
optionalParameters = optionalParameters || {};
const hookFunctions = hookFunctionsPreExecute();
// TODO: why are workers pushing to frontend?
// TODO: simplifying this for now to just leave the bare minimum hooks
// const hookFunctions = hookFunctionsPush();
// const preExecuteFunctions = hookFunctionsPreExecute();
// for (const key of Object.keys(preExecuteFunctions)) {
// if (hookFunctions[key] === undefined) {
// hookFunctions[key] = [];
// }
// hookFunctions[key]!.push.apply(hookFunctions[key], preExecuteFunctions[key]);
// }
	// When running in worker mode, the main process executes
	// only workflowExecuteBefore + workflowExecuteAfter,
	// so to avoid confusion we remove the other hooks.
hookFunctions.nodeExecuteBefore = [];
hookFunctions.nodeExecuteAfter = [];
hookFunctions.workflowExecuteAfter = [
async function (this: WorkflowHooks, fullRunData: IRun): Promise<void> {
// Don't delete executions before they are finished
if (!fullRunData.finished) return;
const executionStatus = determineFinalExecutionStatus(fullRunData);
fullRunData.status = executionStatus;
const saveSettings = toSaveSettings(this.workflowData.settings);
const isManualMode = this.mode === 'manual';
if (isManualMode && !saveSettings.manual && !fullRunData.waitTill) {
/**
* When manual executions are not being saved, we only soft-delete
* the execution so that the user can access its binary data
* while building their workflow.
*
* The manual execution and its binary data will be hard-deleted
* on the next pruning cycle after the grace period set by
* `EXECUTIONS_DATA_HARD_DELETE_BUFFER`.
*/
await Container.get(ExecutionRepository).softDelete(this.executionId);
return;
}
const shouldNotSave =
(executionStatus === 'success' && !saveSettings.success) ||
(executionStatus !== 'success' && !saveSettings.error);
if (!isManualMode && shouldNotSave && !fullRunData.waitTill) {
await Container.get(ExecutionRepository).hardDelete({
workflowId: this.workflowData.id,
executionId: this.executionId,
});
}
},
];
return new WorkflowHooks(hookFunctions, mode, executionId, workflowData, optionalParameters);
}
/**
* Returns WorkflowHooks instance for running the main workflow
*
*/
export function getWorkflowHooksMain(
data: IWorkflowExecutionDataProcess,
executionId: string,
): WorkflowHooks {
const hookFunctions = hookFunctionsSave();
const pushFunctions = hookFunctionsPush();
for (const key of Object.keys(pushFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], pushFunctions[key]);
}
const preExecuteFunctions = hookFunctionsPreExecute();
for (const key of Object.keys(preExecuteFunctions)) {
const hooks = hookFunctions[key] ?? [];
hooks.push.apply(hookFunctions[key], preExecuteFunctions[key]);
}
if (!hookFunctions.nodeExecuteBefore) hookFunctions.nodeExecuteBefore = [];
if (!hookFunctions.nodeExecuteAfter) hookFunctions.nodeExecuteAfter = [];
return new WorkflowHooks(hookFunctions, data.executionMode, executionId, data.workflowData, {
pushRef: data.pushRef,
retryOf: data.retryOf as string,
});
}


@ -20,7 +20,15 @@ import PCancelable from 'p-cancelable';
import { ActiveExecutions } from '@/active-executions';
import config from '@/config';
import { ExecutionRepository } from '@/databases/repositories/execution.repository';
import { ExecutionNotFoundError } from '@/errors/execution-not-found-error';
import { EventService } from '@/events/event.service';
import {
getWorkflowHooksMain,
getWorkflowHooksWorkerExecuter,
getWorkflowHooksWorkerMain,
} from '@/execution-lifecycle/execution-lifecycle-hooks';
import { ExternalHooks } from '@/external-hooks';
import { ManualExecutionService } from '@/manual-execution.service';
import { NodeTypes } from '@/node-types';
import type { ScalingService } from '@/scaling/scaling.service';
import type { Job, JobData } from '@/scaling/scaling.types';
@ -29,10 +37,6 @@ import * as WorkflowExecuteAdditionalData from '@/workflow-execute-additional-da
import { generateFailedExecutionFromError } from '@/workflow-helpers';
import { WorkflowStaticDataService } from '@/workflows/workflow-static-data.service';
import { ExecutionNotFoundError } from './errors/execution-not-found-error';
import { EventService } from './events/event.service';
import { ManualExecutionService } from './manual-execution.service';
@Service()
export class WorkflowRunner {
private scalingService: ScalingService;
@ -138,7 +142,7 @@ export class WorkflowRunner {
} catch (error) {
// Create a failed execution with the data for the node, save it and abort execution
const runData = generateFailedExecutionFromError(data.executionMode, error, error.node);
const workflowHooks = WorkflowExecuteAdditionalData.getWorkflowHooksMain(data, executionId);
const workflowHooks = getWorkflowHooksMain(data, executionId);
await workflowHooks.executeHookFunctions('workflowExecuteBefore', [
undefined,
data.executionData,
@ -267,7 +271,7 @@ export class WorkflowRunner {
await this.executionRepository.setRunning(executionId); // write
try {
additionalData.hooks = WorkflowExecuteAdditionalData.getWorkflowHooksMain(data, executionId);
additionalData.hooks = getWorkflowHooksMain(data, executionId);
additionalData.hooks.hookFunctions.sendResponse = [
async (response: IExecuteResponsePromiseData): Promise<void> => {
@ -368,12 +372,9 @@ export class WorkflowRunner {
try {
job = await this.scalingService.addJob(jobData, { priority: realtime ? 50 : 100 });
hooks = WorkflowExecuteAdditionalData.getWorkflowHooksWorkerMain(
data.executionMode,
executionId,
data.workflowData,
{ retryOf: data.retryOf ? data.retryOf.toString() : undefined },
);
hooks = getWorkflowHooksWorkerMain(data.executionMode, executionId, data.workflowData, {
retryOf: data.retryOf ? data.retryOf.toString() : undefined,
});
			// Normally the workflow should also be supplied here, but since it is only used for
			// sending data to the editor-UI, it is not needed.
@ -381,7 +382,7 @@ export class WorkflowRunner {
} catch (error) {
// We use "getWorkflowHooksWorkerExecuter" as "getWorkflowHooksWorkerMain" does not contain the
// "workflowExecuteAfter" which we require.
const hooks = WorkflowExecuteAdditionalData.getWorkflowHooksWorkerExecuter(
const hooks = getWorkflowHooksWorkerExecuter(
data.executionMode,
executionId,
data.workflowData,
@ -399,7 +400,7 @@ export class WorkflowRunner {
// We use "getWorkflowHooksWorkerExecuter" as "getWorkflowHooksWorkerMain" does not contain the
// "workflowExecuteAfter" which we require.
const hooksWorker = WorkflowExecuteAdditionalData.getWorkflowHooksWorkerExecuter(
const hooksWorker = getWorkflowHooksWorkerExecuter(
data.executionMode,
executionId,
data.workflowData,
@ -417,7 +418,7 @@ export class WorkflowRunner {
} catch (error) {
// We use "getWorkflowHooksWorkerExecuter" as "getWorkflowHooksWorkerMain" does not contain the
// "workflowExecuteAfter" which we require.
const hooks = WorkflowExecuteAdditionalData.getWorkflowHooksWorkerExecuter(
const hooks = getWorkflowHooksWorkerExecuter(
data.executionMode,
executionId,
data.workflowData,


@ -371,6 +371,12 @@
</div>
{{/if}}
{{#if isHtml}}
<div class="form-group">
{{{html}}}
</div>
{{/if}}
{{#if isTextarea}}
<div class='form-group'>
<label class='form-label {{inputRequired}}' for='{{id}}'>{{label}}</label>


@ -1,10 +1,85 @@
<html>
<script>
(function messageParent() {
const broadcastChannel = new BroadcastChannel('oauth-callback');
broadcastChannel.postMessage('success');
})();
</script>
<head>
<style>
html {
font-size: 16px;
}
body {
font-family: sans-serif;
}
Got connected. The window can be closed now.
.center-container {
display: flex;
align-items: center;
height: 100vh
}
.left-container {
margin-left: auto;
margin-right: auto;
display: flex;
flex-direction: column;
}
.row {
display: flex;
flex-direction: row;
gap: 20px;
}
.icon {
width: 2.5rem;
fill: #2AA568;
}
.logo {
width: 8rem;
}
h1 {
font-size: 2.5rem;
color: #0F1430;
}
			p {
				font-weight: 500;
				color: #707183;
				font-size: 1.1rem;
			}
</style>
</head>
<body>
<div class="center-container">
<div class="left-container">
<div class="row">
<img src="{{imagePath}}" class="logo" />
</div>
<div class="row">
<svg
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 512 512"
class="icon"
>
<!--!Font Awesome Free 6.7.2 by @fontawesome - https://fontawesome.com License - https://fontawesome.com/license/free Copyright 2025 Fonticons, Inc.-->
<path
d="M256 512A256 256 0 1 0 256 0a256 256 0 1 0 0 512zM369 209L241 337c-9.4 9.4-24.6 9.4-33.9 0l-64-64c-9.4-9.4-9.4-24.6 0-33.9s24.6-9.4 33.9 0l47 47L335 175c9.4-9.4 24.6-9.4 33.9 0s9.4 24.6 0 33.9z"
/>
</svg>
<h1>Connection successful</h1>
</div>
<div class="row">
<p>This window will close automatically in 5 seconds.</p>
</div>
</div>
</div>
<script type="text/javascript">
(function messageParent() {
const broadcastChannel = new BroadcastChannel('oauth-callback');
broadcastChannel.postMessage('success');
})();
(function autoclose(){
setTimeout(function() { window.close(); }, 5000);
})();
</script>
</body>
</html>
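// The template above reports success over a BroadcastChannel named 'oauth-callback' and then
// closes itself after five seconds. A minimal TypeScript sketch of the listening side that the
// opener window could use; the handler below is illustrative, not the editor's actual
// implementation.
const oauthChannel = new BroadcastChannel('oauth-callback');
oauthChannel.addEventListener('message', (event: MessageEvent<string>) => {
	if (event.data === 'success') {
		console.log('OAuth flow completed'); // e.g. refresh the credential state and stop waiting
		oauthChannel.close();
	}
});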


@ -116,7 +116,7 @@ describe('OAuth2 API', () => {
.query({ code: 'auth_code', state })
.expect(200);
expect(renderSpy).toHaveBeenCalledWith('oauth-callback');
expect(renderSpy).toHaveBeenCalledWith('oauth-callback', { imagePath: 'n8n-logo.png' });
const updatedCredential = await Container.get(CredentialsHelper).getCredentials(
credential,


@ -1,6 +1,6 @@
{
"name": "n8n-core",
"version": "1.75.0",
"version": "1.76.0",
"description": "Core functionality of n8n",
"main": "dist/index",
"types": "dist/index.d.ts",


@ -1,3 +1,7 @@
// Disable task runners until we have fixed the "run test workflows" test
// to mock the Code Node execution
process.env.N8N_RUNNERS_ENABLED = 'false';
// NOTE: Diagrams in this file have been created with https://asciiflow.com/#/
// If you update the tests, please update the diagrams as well.
// If you add a test, please create a new diagram.


@ -1,6 +1,6 @@
{
"name": "n8n-design-system",
"version": "1.65.0",
"version": "1.66.0",
"main": "src/main.ts",
"import": "src/main.ts",
"scripts": {


@ -1,6 +1,6 @@
{
"name": "n8n-editor-ui",
"version": "1.75.0",
"version": "1.76.0",
"description": "Workflow Editor UI for n8n",
"main": "index.js",
"scripts": {


@ -1,6 +1,9 @@
import { useAIAssistantHelpers } from '@/composables/useAIAssistantHelpers';
import { AI_ASSISTANT_MAX_CONTENT_LENGTH } from '@/constants';
import type { ICredentialsResponse, IRestApiContext } from '@/Interface';
import type { AskAiRequest, ChatRequest, ReplaceCodeRequest } from '@/types/assistant.types';
import { makeRestApiRequest, streamRequest } from '@/utils/apiUtils';
import { getObjectSizeInKB } from '@/utils/objectUtils';
import type { IDataObject } from 'n8n-workflow';
export function chatWithAssistant(
@ -10,6 +13,15 @@ export function chatWithAssistant(
onDone: () => void,
onError: (e: Error) => void,
): void {
try {
const payloadSize = getObjectSizeInKB(payload.payload);
if (payloadSize > AI_ASSISTANT_MAX_CONTENT_LENGTH) {
useAIAssistantHelpers().trimPayloadSize(payload);
}
} catch (e) {
onError(e);
return;
}
void streamRequest<ChatRequest.ResponsePayload>(
ctx,
'/ai/chat',


@ -230,28 +230,28 @@ describe('CanvasChat', () => {
// Verify workflow execution
expect(workflowsStore.runWorkflow).toHaveBeenCalledWith(
expect.objectContaining({
runData: {
'When chat message received': [
{
data: {
main: [
[
{
json: {
action: 'sendMessage',
chatInput: 'Hello AI!',
sessionId: expect.any(String),
},
runData: undefined,
triggerToStartFrom: {
name: 'When chat message received',
data: {
data: {
main: [
[
{
json: {
action: 'sendMessage',
chatInput: 'Hello AI!',
sessionId: expect.any(String),
},
],
},
],
},
executionStatus: 'success',
executionTime: 0,
source: [null],
startTime: expect.any(Number),
],
},
],
executionStatus: 'success',
executionTime: 0,
source: [null],
startTime: expect.any(Number),
},
},
}),
);


@ -1,6 +1,8 @@
import { createTestingPinia } from '@pinia/testing';
import JsonEditor from '@/components/JsonEditor/JsonEditor.vue';
import { renderComponent } from '@/__tests__/render';
import { waitFor } from '@testing-library/vue';
import { userEvent } from '@testing-library/user-event';
describe('JsonEditor', () => {
const renderEditor = (jsonString: string) =>
@ -13,18 +15,29 @@ describe('JsonEditor', () => {
it('renders simple json', async () => {
const modelValue = '{ "testing": [true, 5] }';
const result = renderEditor(modelValue);
expect(result.container.querySelector('.cm-content')?.textContent).toEqual(modelValue);
const { getByRole } = renderEditor(modelValue);
expect(getByRole('textbox').textContent).toEqual(modelValue);
});
it('renders multiline json', async () => {
const modelValue = '{\n\t"testing": [true, 5]\n}';
const result = renderEditor(modelValue);
const gutter = result.container.querySelector('.cm-gutters');
const { getByRole, container } = renderEditor(modelValue);
const gutter = container.querySelector('.cm-gutters');
expect(gutter?.querySelectorAll('.cm-lineNumbers .cm-gutterElement').length).toEqual(4);
const content = result.container.querySelector('.cm-content');
const lines = [...content!.querySelectorAll('.cm-line').values()].map((l) => l.textContent);
const content = getByRole('textbox');
const lines = [...content.querySelectorAll('.cm-line').values()].map((l) => l.textContent);
expect(lines).toEqual(['{', '\t"testing": [true, 5]', '}']);
});
it('emits update:model-value events', async () => {
const modelValue = '{ "test": 1 }';
const { emitted, getByRole } = renderEditor(modelValue);
const textbox = await waitFor(() => getByRole('textbox'));
await userEvent.type(textbox, 'test');
await waitFor(() => expect(emitted('update:modelValue')).toContainEqual(['test{ "test": 1 }']));
});
});


@ -36,7 +36,6 @@ const emit = defineEmits<{
const jsonEditorRef = ref<HTMLDivElement>();
const editor = ref<EditorView | null>(null);
const editorState = ref<EditorState | null>(null);
const isDirty = ref(false);
const extensions = computed(() => {
const extensionsToApply: Extension[] = [
@ -66,7 +65,6 @@ const extensions = computed(() => {
bracketMatching(),
mappingDropCursor(),
EditorView.updateListener.of((viewUpdate: ViewUpdate) => {
isDirty.value = true;
if (!viewUpdate.docChanged || !editor.value) return;
emit('update:modelValue', editor.value?.state.doc.toString());
}),
@ -81,7 +79,6 @@ onMounted(() => {
onBeforeUnmount(() => {
if (!editor.value) return;
if (isDirty.value) emit('update:modelValue', editor.value.state.doc.toString());
editor.value.destroy();
});


@ -87,6 +87,41 @@ describe('ProjectMoveResourceModal', () => {
expect(getByText(/Currently there are not any projects or users available/)).toBeVisible();
});
it('should not hide project select if filter has no result', async () => {
const projects = Array.from({ length: 5 }, createProjectListItem);
projectsStore.availableProjects = projects;
const props = {
modalName: PROJECT_MOVE_RESOURCE_MODAL,
data: {
resourceType: 'workflow',
resourceTypeLabel: 'Workflow',
resource: {
id: '1',
homeProject: {
id: projects[0].id,
name: projects[0].name,
},
},
},
};
const { getByTestId, getByRole } = renderComponent({ props });
const projectSelect = getByTestId('project-move-resource-modal-select');
const projectSelectInput: HTMLInputElement = getByRole('combobox');
expect(projectSelectInput).toBeVisible();
expect(projectSelect).toBeVisible();
const projectSelectDropdownItems = await getDropdownItems(projectSelect);
expect(projectSelectDropdownItems).toHaveLength(projects.length - 1);
await userEvent.click(projectSelectInput);
await userEvent.type(projectSelectInput, 'non-existing project');
expect(projectSelect).toBeVisible();
});
it('should not load workflow if the resource is a credential', async () => {
const telemetryTrackSpy = vi.spyOn(telemetry, 'track');
projectsStore.availableProjects = [createProjectListItem()];


@ -47,12 +47,14 @@ const availableProjects = computed(() =>
'name',
projectsStore.availableProjects.filter(
(p) =>
p.name?.toLowerCase().includes(filter.value.toLowerCase()) &&
p.id !== props.data.resource.homeProject?.id &&
(!p.scopes || getResourcePermissions(p.scopes)[props.data.resourceType].create),
),
),
);
const filteredProjects = computed(() =>
availableProjects.value.filter((p) => p.name?.toLowerCase().includes(filter.value.toLowerCase())),
);
const selectedProject = computed(() =>
availableProjects.value.find((p) => p.id === projectId.value),
);
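// The split above keeps ownership/permission filtering (availableProjects) separate from the
// text filter (filteredProjects), so an empty search result no longer hides the select. A
// standalone sketch of the same two-stage filtering with simplified, illustrative types.
interface ProjectOptionSketch {
	id: string;
	name?: string;
}

function splitProjectFilters(
	all: ProjectOptionSketch[],
	homeProjectId: string | undefined,
	query: string,
) {
	const available = all.filter((p) => p.id !== homeProjectId);
	const filtered = available.filter((p) => p.name?.toLowerCase().includes(query.toLowerCase()));
	return { available, filtered };
}

const { available, filtered } = splitProjectFilters(
	[
		{ id: '1', name: 'Home project' },
		{ id: '2', name: 'Marketing' },
	],
	'1',
	'does-not-match',
);
console.log(available.length, filtered.length); // 1 0 — the dropdown source stays populated even when the filter matches nothing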
@ -217,7 +219,7 @@ onMounted(async () => {
<N8nIcon icon="search" />
</template>
<N8nOption
v-for="p in availableProjects"
v-for="p in filteredProjects"
:key="p.id"
:value="p.id"
:label="p.name"


@ -1,3 +1,3 @@
// Vitest Snapshot v1, https://vitest.dev/guide/snapshot.html
exports[`useBugReporting > should generate the correct reporting URL 1`] = `"https://github.com/n8n-io/n8n/issues/new?labels=bug-report&body=%0A%3C%21--+Please+follow+the+template+below.+Skip+the+questions+that+are+not+relevant+to+you.+--%3E%0A%0A%23%23+Describe+the+problem%2Ferror%2Fquestion%0A%0A%0A%23%23+What+is+the+error+message+%28if+any%29%3F%0A%0A%0A%23%23+Please+share+your+workflow%2Fscreenshots%2Frecording%0A%0A%60%60%60%0A%28Select+the+nodes+on+your+canvas+and+use+the+keyboard+shortcuts+CMD%2BC%2FCTRL%2BC+and+CMD%2BV%2FCTRL%2BV+to+copy+and+paste+the+workflow.%29%0A%60%60%60%0A%0A%0A%23%23+Share+the+output+returned+by+the+last+node%0A%3C%21--+If+you+need+help+with+data+transformations%2C+please+also+share+your+expected+output.+--%3E%0A%0A%0Amocked+debug+info%7D"`;
exports[`useBugReporting > should generate the correct reporting URL 1`] = `"https://github.com/n8n-io/n8n/issues/new?labels=bug-report&body=%0A%3C%21--+Please+follow+the+template+below.+Skip+the+questions+that+are+not+relevant+to+you.+--%3E%0A%0A%23%23+Describe+the+problem%2Ferror%2Fquestion%0A%0A%0A%23%23+What+is+the+error+message+%28if+any%29%3F%0A%0A%0A%23%23+Please+share+your+workflow%2Fscreenshots%2Frecording%0A%0A%60%60%60%0A%28Select+the+nodes+on+your+canvas+and+use+the+keyboard+shortcuts+CMD%2BC%2FCTRL%2BC+and+CMD%2BV%2FCTRL%2BV+to+copy+and+paste+the+workflow.%29%0A%E2%9A%A0%EF%B8%8F+WARNING+%E2%9A%A0%EF%B8%8F+If+you+have+sensitive+data+in+your+workflow+%28like+API+keys%29%2C+please+remove+it+before+sharing.%0A%60%60%60%0A%0A%0A%23%23+Share+the+output+returned+by+the+last+node%0A%3C%21--+If+you+need+help+with+data+transformations%2C+please+also+share+your+expected+output.+--%3E%0A%0A%0Amocked+debug+info%7D"`;
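// To confirm what the updated snapshot encodes, the body parameter can be decoded with
// standard URL utilities; the fragment below is copied from the snapshot string above.
const encodedFragment = '%E2%9A%A0%EF%B8%8F+WARNING+%E2%9A%A0%EF%B8%8F+If+you+have+sensitive+data';
console.log(decodeURIComponent(encodedFragment.replace(/\+/g, ' ')));
// ⚠️ WARNING ⚠️ If you have sensitive data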


@ -0,0 +1,430 @@
import { VIEWS } from '@/constants';
import type { ChatRequest } from '@/types/assistant.types';
import { NodeConnectionType } from 'n8n-workflow';
export const PAYLOAD_SIZE_FOR_1_PASS = 4;
export const PAYLOAD_SIZE_FOR_2_PASSES = 2;
export const ERROR_HELPER_TEST_PAYLOAD: ChatRequest.RequestPayload = {
payload: {
role: 'user',
type: 'init-error-helper',
user: {
firstName: 'Milorad',
},
error: {
name: 'NodeOperationError',
message: "Referenced node doesn't exist",
description:
"The node <strong>'Hey'</strong> doesn't exist, but it's used in an expression here.",
},
node: {
position: [0, 0],
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: {
'0': {
id: '0957fbdb-a021-413b-9d42-fc847666f999',
name: 'text',
value: 'Lorem ipsum dolor sit amet',
type: 'string',
},
'1': {
id: '8efecfa7-8df7-492e-83e7-3d517ad03e60',
name: 'foo',
value: {
value: "={{ $('Hey').json.name }}",
resolvedExpressionValue: 'Error in expression: "Referenced node doesn\'t exist"',
},
type: 'string',
},
},
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
id: '6dc70bf3-ba54-4481-b9f5-ce255bdd5fb8',
name: 'This is fine',
},
executionSchema: [],
},
};
export const SUPPORT_CHAT_TEST_PAYLOAD: ChatRequest.RequestPayload = {
payload: {
role: 'user',
type: 'init-support-chat',
user: {
firstName: 'Milorad',
},
context: {
currentView: {
name: VIEWS.WORKFLOW,
description:
'The user is currently looking at the current workflow in n8n editor, without any specific node selected.',
},
activeNodeInfo: {
node: {
position: [0, 0],
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: {
'0': {
id: '969e86d0-76de-44f6-b07d-44a8a953f564',
name: 'name',
value: {
value: "={{ $('Edit Fields 2').name }}",
resolvedExpressionValue:
'Error in expression: "Referenced node doesn\'t exist"',
},
type: 'number',
},
},
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
id: '8eac1591-ddc6-4d93-bec7-998cbfe27cc7',
name: 'Edit Fields1',
},
executionStatus: {
status: 'error',
error: {
name: 'NodeOperationError',
message: "Referenced node doesn't exist",
stack:
"NodeOperationError: Referenced node doesn't exist\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/manual.mode.ts:256:9)\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/SetV2.node.ts:351:48)\n at WorkflowExecute.runNode (/Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1097:31)\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1505:38\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:2066:11",
},
},
referencedNodes: [],
},
currentWorkflow: {
name: '🧪 Assistant context test',
active: false,
connections: {
'When clicking Test workflow': {
main: [
[
{
node: 'Edit Fields',
type: NodeConnectionType.Main,
index: 0,
},
],
],
},
'Edit Fields': {
main: [
[
{
node: 'Bad request no chat found',
type: NodeConnectionType.Main,
index: 0,
},
{
node: 'Slack',
type: NodeConnectionType.Main,
index: 0,
},
{
node: 'Edit Fields1',
type: NodeConnectionType.Main,
index: 0,
},
{
node: 'Edit Fields2',
type: NodeConnectionType.Main,
index: 0,
},
],
],
},
},
nodes: [
{
parameters: {
notice: '',
},
id: 'c457ff96-3b0c-4dbc-b47f-dc88396a46ae',
name: 'When clicking Test workflow',
type: 'n8n-nodes-base.manualTrigger',
position: [-60, 200],
typeVersion: 1,
},
{
parameters: {
resource: 'chat',
operation: 'get',
chatId: '13',
},
id: '60ddc045-d4e3-4b62-9832-12ecf78937a6',
name: 'Bad request no chat found',
type: 'n8n-nodes-base.telegram',
typeVersion: 1.1,
position: [540, 0],
issues: {},
disabled: true,
},
{
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: [
{
id: '70448b12-9b2b-4bfb-abee-6432c4c58de1',
name: 'name',
value: 'Joe',
type: 'string',
},
],
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
position: [200, 200],
id: '0a831739-13cd-4541-b20b-7db73abbcaf0',
name: 'Edit Fields',
},
{
parameters: {
authentication: 'oAuth2',
resource: 'channel',
operation: 'archive',
channelId: {
__rl: true,
mode: 'list',
value: '',
},
},
type: 'n8n-nodes-base.slack',
typeVersion: 2.2,
position: [540, 200],
id: 'aff7471e-b2bc-4274-abe1-97897a17eaa6',
name: 'Slack',
webhookId: '7f8b574c-7729-4220-bbe9-bf5aa382406a',
credentials: {
slackOAuth2Api: {
id: 'mZRj4wi3gavIzu9b',
name: 'Slack account',
},
},
disabled: true,
},
{
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: [
{
id: '969e86d0-76de-44f6-b07d-44a8a953f564',
name: 'name',
value: "={{ $('Edit Fields 2').name }}",
type: 'number',
},
],
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
position: [540, 400],
id: '8eac1591-ddc6-4d93-bec7-998cbfe27cc7',
name: 'Edit Fields1',
issues: {
execution: true,
},
},
{
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: [
{
id: '9bdfc283-64f7-41c5-9a55-b8d8ccbe3e9d',
name: 'age',
value: '={{ $json.name }}',
type: 'number',
},
],
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
position: [440, 560],
id: '34e56e14-d1a9-4a73-9208-15d39771a9ba',
name: 'Edit Fields2',
},
],
},
executionData: {
runData: {
'When clicking Test workflow': [
{
hints: [],
startTime: 1737540693122,
executionTime: 1,
source: [],
executionStatus: 'success',
},
],
'Edit Fields': [
{
hints: [],
startTime: 1737540693124,
executionTime: 2,
source: [
{
previousNode: 'When clicking Test workflow',
},
],
executionStatus: 'success',
},
],
'Bad request no chat found': [
{
hints: [],
startTime: 1737540693126,
executionTime: 0,
source: [
{
previousNode: 'Edit Fields',
},
],
executionStatus: 'success',
},
],
Slack: [
{
hints: [],
startTime: 1737540693127,
executionTime: 0,
source: [
{
previousNode: 'Edit Fields',
},
],
executionStatus: 'success',
},
],
'Edit Fields1': [
{
hints: [],
startTime: 1737540693127,
executionTime: 28,
source: [
{
previousNode: 'Edit Fields',
},
],
executionStatus: 'error',
// @ts-expect-error Incomplete mock objects are expected
error: {
level: 'warning',
tags: {
packageName: 'workflow',
},
context: {
itemIndex: 0,
nodeCause: 'Edit Fields 2',
descriptionKey: 'nodeNotFound',
parameter: 'assignments',
},
functionality: 'regular',
name: 'NodeOperationError',
timestamp: 1737540693141,
node: {
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: [
{
id: '969e86d0-76de-44f6-b07d-44a8a953f564',
name: 'name',
value: "={{ $('Edit Fields 2').name }}",
type: 'number',
},
],
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
position: [540, 400],
id: '8eac1591-ddc6-4d93-bec7-998cbfe27cc7',
name: 'Edit Fields1',
},
messages: [],
message: "Referenced node doesn't exist",
stack:
"NodeOperationError: Referenced node doesn't exist\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/manual.mode.ts:256:9)\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/SetV2.node.ts:351:48)\n at WorkflowExecute.runNode (/Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1097:31)\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1505:38\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:2066:11",
},
},
],
},
// @ts-expect-error Incomplete mock objects are expected
error: {
level: 'warning',
tags: {
packageName: 'workflow',
},
context: {
itemIndex: 0,
nodeCause: 'Edit Fields 2',
descriptionKey: 'nodeNotFound',
parameter: 'assignments',
},
functionality: 'regular',
name: 'NodeOperationError',
timestamp: 1737540693141,
node: {
parameters: {
mode: 'manual',
duplicateItem: false,
assignments: {
assignments: [
{
id: '969e86d0-76de-44f6-b07d-44a8a953f564',
name: 'name',
value: "={{ $('Edit Fields 2').name }}",
type: 'number',
},
],
},
includeOtherFields: false,
options: {},
},
type: 'n8n-nodes-base.set',
typeVersion: 3.4,
position: [540, 400],
id: '8eac1591-ddc6-4d93-bec7-998cbfe27cc7',
name: 'Edit Fields1',
},
messages: [],
message: "Referenced node doesn't exist",
stack:
"NodeOperationError: Referenced node doesn't exist\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/manual.mode.ts:256:9)\n at ExecuteContext.execute (/Users/miloradfilipovic/workspace/n8n/packages/nodes-base/nodes/Set/v2/SetV2.node.ts:351:48)\n at WorkflowExecute.runNode (/Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1097:31)\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:1505:38\n at /Users/miloradfilipovic/workspace/n8n/packages/core/src/execution-engine/workflow-execute.ts:2066:11",
},
lastNodeExecuted: 'Edit Fields1',
},
},
question: 'Hey',
},
};


@ -4,6 +4,13 @@ import { useAIAssistantHelpers } from './useAIAssistantHelpers';
import { createTestingPinia } from '@pinia/testing';
import { setActivePinia } from 'pinia';
import type { IWorkflowDb } from '@/Interface';
import type { ChatRequest } from '@/types/assistant.types';
import {
ERROR_HELPER_TEST_PAYLOAD,
PAYLOAD_SIZE_FOR_1_PASS,
PAYLOAD_SIZE_FOR_2_PASSES,
SUPPORT_CHAT_TEST_PAYLOAD,
} from './useAIAssistantHelpers.test.constants';
const referencedNodesTestCases: Array<{ caseName: string; node: INode; expected: string[] }> = [
{
@ -549,3 +556,67 @@ describe('Simplify assistant payloads', () => {
}
});
});
describe('Trim Payload Size', () => {
let aiAssistantHelpers: ReturnType<typeof useAIAssistantHelpers>;
beforeEach(() => {
setActivePinia(createTestingPinia());
aiAssistantHelpers = useAIAssistantHelpers();
});
it('Should trim active node parameters in error helper payload', () => {
const payload = ERROR_HELPER_TEST_PAYLOAD;
aiAssistantHelpers.trimPayloadSize(payload);
expect((payload.payload as ChatRequest.InitErrorHelper).node.parameters).toEqual({});
});
it('Should trim all node parameters in support chat', () => {
// Testing the scenario where only one trimming pass is needed
// (payload is under the limit after removing all node parameters and execution data)
const payload: ChatRequest.RequestPayload = SUPPORT_CHAT_TEST_PAYLOAD;
const supportPayload: ChatRequest.InitSupportChat =
payload.payload as ChatRequest.InitSupportChat;
// Trimming to 4kb should be successful
expect(() =>
aiAssistantHelpers.trimPayloadSize(payload, PAYLOAD_SIZE_FOR_1_PASS),
).not.toThrow();
// All active node parameters should be removed
expect(supportPayload?.context?.activeNodeInfo?.node?.parameters).toEqual({});
// Also, all node parameters in the workflow should be removed
supportPayload.context?.currentWorkflow?.nodes?.forEach((node) => {
expect(node.parameters).toEqual({});
});
// Node parameters in the execution data should be removed
expect(supportPayload.context?.executionData?.runData).toEqual({});
if (
supportPayload.context?.executionData?.error &&
'node' in supportPayload.context.executionData.error
) {
expect(supportPayload.context?.executionData?.error?.node?.parameters).toEqual({});
}
// Context object should still be there
expect(supportPayload.context).to.be.an('object');
});
it('Should trim the whole context in support chat', () => {
// Testing the scenario where both trimming passes are needed
// (payload is over the limit after removing all node parameters and execution data)
const payload: ChatRequest.RequestPayload = SUPPORT_CHAT_TEST_PAYLOAD;
const supportPayload: ChatRequest.InitSupportChat =
payload.payload as ChatRequest.InitSupportChat;
// Trimming should be successful
expect(() =>
aiAssistantHelpers.trimPayloadSize(payload, PAYLOAD_SIZE_FOR_2_PASSES),
).not.toThrow();
// The whole context object should be removed
expect(supportPayload.context).not.toBeDefined();
});
it('Should throw an error if payload is too big after trimming', () => {
const payload = ERROR_HELPER_TEST_PAYLOAD;
expect(() => aiAssistantHelpers.trimPayloadSize(payload, 0.2)).toThrow();
});
});


@ -14,9 +14,10 @@ import { executionDataToJson, getMainAuthField, getNodeAuthOptions } from '@/uti
import type { ChatRequest } from '@/types/assistant.types';
import { useWorkflowsStore } from '@/stores/workflows.store';
import { useDataSchema } from './useDataSchema';
import { VIEWS } from '@/constants';
import { AI_ASSISTANT_MAX_CONTENT_LENGTH, VIEWS } from '@/constants';
import { useI18n } from './useI18n';
import type { IWorkflowDb } from '@/Interface';
import { getObjectSizeInKB } from '@/utils/objectUtils';
const CANVAS_VIEWS = [VIEWS.NEW_WORKFLOW, VIEWS.WORKFLOW, VIEWS.EXECUTION_DEBUG];
const EXECUTION_VIEWS = [VIEWS.EXECUTION_PREVIEW];
@ -251,6 +252,64 @@ export const useAIAssistantHelpers = () => {
nodes: workflow.nodes,
});
/**
* Reduces AI Assistant request payload size to make it fit the specified content length.
	 * If, after two passes, the payload is still too big, an error is thrown.
* @param payload The request payload to trim
* @param size The maximum size of the payload in KB
*/
const trimPayloadToSize = (
payload: ChatRequest.RequestPayload,
size = AI_ASSISTANT_MAX_CONTENT_LENGTH,
): void => {
const requestPayload = payload.payload;
// For support chat, remove parameters from the active node object and all nodes in the workflow
if (requestPayload.type === 'init-support-chat') {
if (requestPayload.context?.activeNodeInfo?.node) {
requestPayload.context.activeNodeInfo.node.parameters = {};
}
if (requestPayload.context?.currentWorkflow) {
requestPayload.context.currentWorkflow?.nodes?.forEach((node) => {
node.parameters = {};
});
}
if (requestPayload.context?.executionData?.runData) {
requestPayload.context.executionData.runData = {};
}
if (
requestPayload.context?.executionData?.error &&
'node' in requestPayload.context?.executionData?.error
) {
if (requestPayload.context?.executionData?.error?.node) {
requestPayload.context.executionData.error.node.parameters = {};
}
}
// If the payload is still too big, remove the whole context object
if (getRequestPayloadSize(payload) > size) {
requestPayload.context = undefined;
}
// For error helper, remove parameters from the active node object
// This will leave just the error, user info and basic node structure in the payload
} else if (requestPayload.type === 'init-error-helper') {
requestPayload.node.parameters = {};
}
// If the payload is still too big, throw an error that will be shown to the user
if (getRequestPayloadSize(payload) > size) {
throw new Error(locale.baseText('aiAssistant.payloadTooBig.message'));
}
};
/**
* Get the size of the request payload in KB, returns 0 if the payload is not a valid object
*/
const getRequestPayloadSize = (payload: ChatRequest.RequestPayload): number => {
try {
return getObjectSizeInKB(payload.payload);
} catch (error) {
return 0;
}
};
return {
processNodeForAssistant,
getNodeInfoForAssistant,
@ -261,5 +320,6 @@ export const useAIAssistantHelpers = () => {
getReferencedNodes,
simplifyResultData,
simplifyWorkflowForAssistant,
trimPayloadSize: trimPayloadToSize,
};
};
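// getObjectSizeInKB drives both the pre-flight check in chatWithAssistant and the trimming
// above. A plausible standalone implementation, assuming the size is measured as the
// JSON-serialized byte length; the real util in '@/utils/objectUtils' may differ.
function getObjectSizeInKBSketch(obj: unknown): number {
	if (obj === undefined || obj === null) return 0;
	const bytes = new TextEncoder().encode(JSON.stringify(obj)).length;
	return Math.round((bytes / 1024) * 100) / 100;
}

console.log(getObjectSizeInKBSketch({ question: 'Hey' })); // roughly 0.02 for this tiny payload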


@ -15,6 +15,7 @@ const REPORT_TEMPLATE = `
\`\`\`
(Select the nodes on your canvas and use the keyboard shortcuts CMD+C/CTRL+C and CMD+V/CTRL+V to copy and paste the workflow.)
⚠️ WARNING ⚠️ If you have sensitive data in your workflow (like API keys), please remove it before sharing.
\`\`\`


@ -21,7 +21,7 @@ import { useToast } from './useToast';
import { useI18n } from '@/composables/useI18n';
import { useLocalStorage } from '@vueuse/core';
import { ref } from 'vue';
import { mock } from 'vitest-mock-extended';
import { captor, mock } from 'vitest-mock-extended';
vi.mock('@/stores/workflows.store', () => ({
useWorkflowsStore: vi.fn().mockReturnValue({
@ -409,27 +409,28 @@ describe('useRunWorkflow({ router })', () => {
const mockExecutionResponse = { executionId: '123' };
const mockRunData = { nodeName: [] };
const { runWorkflow } = useRunWorkflow({ router });
const dataCaptor = captor();
const workflow = mock<Workflow>({ name: 'Test Workflow' });
workflow.getParentNodes.mockReturnValue([]);
vi.mocked(useLocalStorage).mockReturnValueOnce(ref(0));
vi.mocked(rootStore).pushConnectionActive = true;
vi.mocked(workflowsStore).runWorkflow.mockResolvedValue(mockExecutionResponse);
vi.mocked(workflowsStore).nodesIssuesExist = false;
vi.mocked(workflowHelpers).getCurrentWorkflow.mockReturnValue({
name: 'Test Workflow',
} as Workflow);
vi.mocked(workflowHelpers).getWorkflowDataToSave.mockResolvedValue({
id: 'workflowId',
nodes: [],
} as unknown as IWorkflowData);
vi.mocked(workflowHelpers).getCurrentWorkflow.mockReturnValue(workflow);
vi.mocked(workflowHelpers).getWorkflowDataToSave.mockResolvedValue(
mock<IWorkflowData>({ id: 'workflowId', nodes: [] }),
);
vi.mocked(workflowsStore).getWorkflowRunData = mockRunData;
// ACT
const result = await runWorkflow({});
const result = await runWorkflow({ destinationNode: 'some node name' });
// ASSERT
expect(result).toEqual(mockExecutionResponse);
expect(workflowsStore.setWorkflowExecutionData).toHaveBeenCalledTimes(1);
expect(vi.mocked(workflowsStore.setWorkflowExecutionData).mock.calls[0][0]).toMatchObject({
expect(workflowsStore.setWorkflowExecutionData).toHaveBeenCalledWith(dataCaptor);
expect(dataCaptor.value).toMatchObject({
data: { resultData: { runData: {} } },
});
});
@ -439,18 +440,47 @@ describe('useRunWorkflow({ router })', () => {
const mockExecutionResponse = { executionId: '123' };
const mockRunData = { nodeName: [] };
const { runWorkflow } = useRunWorkflow({ router });
const dataCaptor = captor();
const workflow = mock<Workflow>({ name: 'Test Workflow' });
workflow.getParentNodes.mockReturnValue([]);
vi.mocked(useLocalStorage).mockReturnValueOnce(ref(1));
vi.mocked(rootStore).pushConnectionActive = true;
vi.mocked(workflowsStore).runWorkflow.mockResolvedValue(mockExecutionResponse);
vi.mocked(workflowsStore).nodesIssuesExist = false;
vi.mocked(workflowHelpers).getCurrentWorkflow.mockReturnValue({
name: 'Test Workflow',
} as Workflow);
vi.mocked(workflowHelpers).getWorkflowDataToSave.mockResolvedValue({
id: 'workflowId',
nodes: [],
} as unknown as IWorkflowData);
vi.mocked(workflowHelpers).getCurrentWorkflow.mockReturnValue(workflow);
vi.mocked(workflowHelpers).getWorkflowDataToSave.mockResolvedValue(
mock<IWorkflowData>({ id: 'workflowId', nodes: [] }),
);
vi.mocked(workflowsStore).getWorkflowRunData = mockRunData;
// ACT
const result = await runWorkflow({ destinationNode: 'some node name' });
// ASSERT
expect(result).toEqual(mockExecutionResponse);
expect(workflowsStore.setWorkflowExecutionData).toHaveBeenCalledTimes(1);
expect(workflowsStore.setWorkflowExecutionData).toHaveBeenCalledWith(dataCaptor);
expect(dataCaptor.value).toMatchObject({ data: { resultData: { runData: mockRunData } } });
});
it("does not send run data if it's not a partial execution even if `PartialExecution.version` is set to 1", async () => {
// ARRANGE
const mockExecutionResponse = { executionId: '123' };
const mockRunData = { nodeName: [] };
const { runWorkflow } = useRunWorkflow({ router });
const dataCaptor = captor();
const workflow = mock<Workflow>({ name: 'Test Workflow' });
workflow.getParentNodes.mockReturnValue([]);
vi.mocked(useLocalStorage).mockReturnValueOnce(ref(1));
vi.mocked(rootStore).pushConnectionActive = true;
vi.mocked(workflowsStore).runWorkflow.mockResolvedValue(mockExecutionResponse);
vi.mocked(workflowsStore).nodesIssuesExist = false;
vi.mocked(workflowHelpers).getCurrentWorkflow.mockReturnValue(workflow);
vi.mocked(workflowHelpers).getWorkflowDataToSave.mockResolvedValue(
mock<IWorkflowData>({ id: 'workflowId', nodes: [] }),
);
vi.mocked(workflowsStore).getWorkflowRunData = mockRunData;
// ACT
@ -458,10 +488,9 @@ describe('useRunWorkflow({ router })', () => {
// ASSERT
expect(result).toEqual(mockExecutionResponse);
expect(workflowsStore.setWorkflowExecutionData).toHaveBeenCalledTimes(1);
expect(vi.mocked(workflowsStore.setWorkflowExecutionData).mock.calls[0][0]).toMatchObject({
data: { resultData: { runData: mockRunData } },
});
expect(workflowsStore.runWorkflow).toHaveBeenCalledTimes(1);
expect(workflowsStore.runWorkflow).toHaveBeenCalledWith(dataCaptor);
expect(dataCaptor.value).toHaveProperty('runData', undefined);
});
});


@ -264,14 +264,23 @@ export function useRunWorkflow(useRunWorkflowOpts: { router: ReturnType<typeof u
// 0 is the old flow
// 1 is the new flow
const partialExecutionVersion = useLocalStorage('PartialExecution.version', -1);
// partial executions must have a destination node
const isPartialExecution = options.destinationNode !== undefined;
const startRunData: IStartRunData = {
workflowData,
// With the new partial execution version the backend decides what run
// data to use and what to ignore.
runData: partialExecutionVersion.value === 1 ? (runData ?? undefined) : newRunData,
runData: !isPartialExecution
? // if it's a full execution we don't want to send any run data
undefined
: partialExecutionVersion.value === 1
? // With the new partial execution version the backend decides
							// what run data to use and what to ignore.
(runData ?? undefined)
: // for v0 we send the run data the FE constructed
newRunData,
startNodes,
triggerToStartFrom,
};
if ('destinationNode' in options) {
startRunData.destinationNode = options.destinationNode;
}
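// The nested ternary above decides which run data (if any) is sent to the backend. The same
// decision written out as a standalone helper; the return labels are illustrative.
type RunDataChoice = 'none' | 'stored-run-data' | 'frontend-run-data';

function chooseRunData(isPartialExecution: boolean, partialExecutionVersion: number): RunDataChoice {
	if (!isPartialExecution) return 'none'; // full executions never send run data
	// v1: the backend decides what to reuse, so the stored run data is sent as-is
	if (partialExecutionVersion === 1) return 'stored-run-data';
	// v0: the frontend pre-computes the run data to replay
	return 'frontend-run-data';
}

console.log(chooseRunData(false, 1)); // 'none'
console.log(chooseRunData(true, 0)); // 'frontend-run-data'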


@ -907,3 +907,5 @@ export const APP_MODALS_ELEMENT_ID = 'app-modals';
export const NEW_SAMPLE_WORKFLOW_CREATED_CHANNEL = 'new-sample-sub-workflow-created';
export const AI_NODES_PACKAGE_NAME = '@n8n/n8n-nodes-langchain';
export const AI_ASSISTANT_MAX_CONTENT_LENGTH = 100; // in kilobytes


@ -155,7 +155,8 @@
"aiAssistant.newSessionModal.message": "You already have an active AI Assistant session. Starting a new session will clear your current conversation history.",
"aiAssistant.newSessionModal.question": "Are you sure you want to start a new session?",
"aiAssistant.newSessionModal.confirm": "Start new session",
"aiAssistant.serviceError.message": "Unable to connect to n8n's AI service",
"aiAssistant.serviceError.message": "Unable to connect to n8n's AI service ({message})",
"aiAssistant.payloadTooBig.message": "Payload size is too large",
"aiAssistant.codeUpdated.message.title": "Assistant modified workflow",
"aiAssistant.codeUpdated.message.body1": "Open the",
"aiAssistant.codeUpdated.message.body2": "node to see the changes",

View file

@ -283,7 +283,7 @@ export const useAssistantStore = defineStore(STORES.ASSISTANT, () => {
stopStreaming();
assistantThinkingMessage.value = undefined;
addAssistantError(
`${locale.baseText('aiAssistant.serviceError.message')}: (${e.message})`,
locale.baseText('aiAssistant.serviceError.message', { interpolate: { message: e.message } }),
id,
retry,
);
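Since the `({message})` placeholder now lives in the locale string itself (see the en.json change above), the error detail goes through the i18n interpolation instead of being appended by hand. With a hypothetical error for illustration:

// Template: "Unable to connect to n8n's AI service ({message})"
const e = new Error('Request timed out'); // illustrative error
locale.baseText('aiAssistant.serviceError.message', { interpolate: { message: e.message } });
// → "Unable to connect to n8n's AI service (Request timed out)"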
@ -487,24 +487,25 @@ export const useAssistantStore = defineStore(STORES.ASSISTANT, () => {
openChat();
streaming.value = true;
const payload: ChatRequest.RequestPayload['payload'] = {
role: 'user',
type: 'init-error-helper',
user: {
firstName: usersStore.currentUser?.firstName ?? '',
},
error: context.error,
node: assistantHelpers.processNodeForAssistant(context.node, [
'position',
'parameters.notice',
]),
nodeInputData,
executionSchema: schemas,
authType,
};
chatWithAssistant(
rootStore.restApiContext,
{
payload: {
role: 'user',
type: 'init-error-helper',
user: {
firstName: usersStore.currentUser?.firstName ?? '',
},
error: context.error,
node: assistantHelpers.processNodeForAssistant(context.node, [
'position',
'parameters.notice',
]),
nodeInputData,
executionSchema: schemas,
authType,
},
payload,
},
(msg) => onEachStreamingMessage(msg, id),
() => onDoneStreaming(id),

View file

@ -58,7 +58,7 @@ export namespace ChatRequest {
user: {
firstName: string;
};
context?: UserContext;
context?: UserContext & WorkflowContext;
workflowContext?: WorkflowContext;
question: string;
}

View file

@ -1,4 +1,4 @@
import { STREAM_SEPERATOR, streamRequest } from './apiUtils';
import { ResponseError, STREAM_SEPERATOR, streamRequest } from './apiUtils';
describe('streamRequest', () => {
it('should stream data from the API endpoint', async () => {
@ -54,6 +54,54 @@ describe('streamRequest', () => {
expect(onErrorMock).not.toHaveBeenCalled();
});
it('should stream error response from the API endpoint', async () => {
const testError = { code: 500, message: 'Error happened' };
const encoder = new TextEncoder();
const mockResponse = new ReadableStream({
start(controller) {
controller.enqueue(encoder.encode(JSON.stringify(testError)));
controller.close();
},
});
const mockFetch = vi.fn().mockResolvedValue({
ok: false,
body: mockResponse,
});
global.fetch = mockFetch;
const onChunkMock = vi.fn();
const onDoneMock = vi.fn();
const onErrorMock = vi.fn();
await streamRequest(
{
baseUrl: 'https://api.example.com',
pushRef: '',
},
'/data',
{ key: 'value' },
onChunkMock,
onDoneMock,
onErrorMock,
);
expect(mockFetch).toHaveBeenCalledWith('https://api.example.com/data', {
method: 'POST',
body: JSON.stringify({ key: 'value' }),
credentials: 'include',
headers: {
'Content-Type': 'application/json',
'browser-id': expect.stringContaining('-'),
},
});
expect(onChunkMock).not.toHaveBeenCalled();
expect(onErrorMock).toHaveBeenCalledTimes(1);
expect(onErrorMock).toHaveBeenCalledWith(new ResponseError(testError.message));
});
it('should handle broken stream data', async () => {
const encoder = new TextEncoder();
const mockResponse = new ReadableStream({

View file

@ -198,7 +198,7 @@ export function unflattenExecutionData(fullExecutionData: IExecutionFlattedRespo
return returnData;
}
export async function streamRequest<T>(
export async function streamRequest<T extends object>(
context: IRestApiContext,
apiEndpoint: string,
payload: object,
@ -220,7 +220,7 @@ export async function streamRequest<T>(
try {
const response = await fetch(`${context.baseUrl}${apiEndpoint}`, assistantRequest);
if (response.ok && response.body) {
if (response.body) {
// Handle the streaming response
const reader = response.body.getReader();
const decoder = new TextDecoder('utf-8');
@ -252,7 +252,18 @@ export async function streamRequest<T>(
}
try {
onChunk?.(data);
if (response.ok) {
// Call chunk callback if request was successful
onChunk?.(data);
} else {
// Otherwise, call error callback
const message = 'message' in data ? data.message : response.statusText;
onError?.(
new ResponseError(String(message), {
httpStatusCode: response.status,
}),
);
}
} catch (e: unknown) {
if (e instanceof Error) {
onError?.(e);
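So a non-OK response is still consumed from the stream, but each parsed chunk is routed to `onError` as a `ResponseError` carrying the HTTP status instead of being handed to `onChunk`. A hedged usage sketch, reusing the shape of the test above (the endpoint, payload, and chunk type are made up):

// Illustrative caller; '/ai/chat' and the payload are hypothetical.
await streamRequest<{ text: string }>(
  { baseUrl: 'https://n8n.example.com/rest', pushRef: '' },
  '/ai/chat',
  { question: 'Hi' },
  (chunk) => console.log('chunk:', chunk.text),      // only reached while response.ok
  () => console.log('stream finished'),              // onDone
  (error) => console.error('stream failed:', error), // ResponseError with httpStatusCode on non-OK responses
);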

View file

@ -1,4 +1,4 @@
import { isObjectOrArray, isObject, searchInObject } from '@/utils/objectUtils';
import { isObjectOrArray, isObject, searchInObject, getObjectSizeInKB } from '@/utils/objectUtils';
const testData = [1, '', true, null, undefined, new Date(), () => {}].map((value) => [
value,
@ -95,4 +95,63 @@ describe('objectUtils', () => {
assert(searchInObject({ a: ['b', { c: 'd' }] }, 'd'));
});
});
describe('getObjectSizeInKB', () => {
// Test null/undefined cases
it('returns 0 for null', () => {
expect(getObjectSizeInKB(null)).toBe(0);
});
it('returns 0 for undefined', () => {
expect(getObjectSizeInKB(undefined)).toBe(0);
});
// Test empty objects/arrays
it('returns correct size for empty object', () => {
expect(getObjectSizeInKB({})).toBe(0);
});
it('returns correct size for empty array', () => {
expect(getObjectSizeInKB([])).toBe(0);
});
// Test regular cases
it('calculates size for simple object correctly', () => {
const obj = { name: 'test' };
expect(getObjectSizeInKB(obj)).toBe(0.01);
});
it('calculates size for array correctly', () => {
const arr = [1, 2, 3];
expect(getObjectSizeInKB(arr)).toBe(0.01);
});
it('calculates size for nested object correctly', () => {
const obj = {
name: 'test',
nested: {
value: 123,
},
};
expect(getObjectSizeInKB(obj)).toBe(0.04);
});
// Test error cases
it('throws error for circular reference', () => {
type CircularObj = {
name: string;
self?: CircularObj;
};
const obj: CircularObj = { name: 'test' };
obj.self = obj;
expect(() => getObjectSizeInKB(obj)).toThrow('Failed to calculate object size');
});
it('handles special characters correctly', () => {
const obj = { name: '测试' };
expect(getObjectSizeInKB(obj)).toBe(0.02);
});
});
});

View file

@ -18,3 +18,35 @@ export const searchInObject = (obj: ObjectOrArray, searchString: string): boolea
? searchInObject(entry, searchString)
: entry?.toString().toLowerCase().includes(searchString.toLowerCase()),
);
/**
* Calculate the size of a stringified object in KB.
* @param {unknown} obj - The object to calculate the size of
* @returns {number} The size of the object in KB
* @throws {Error} If the object is not serializable
*/
export const getObjectSizeInKB = (obj: unknown): number => {
if (obj === null || obj === undefined) {
return 0;
}
if (
(typeof obj === 'object' && Object.keys(obj).length === 0) ||
(Array.isArray(obj) && obj.length === 0)
) {
// "{}" and "[]" both take 2 bytes in UTF-8
return Number((2 / 1024).toFixed(2));
}
try {
const str = JSON.stringify(obj);
// Using TextEncoder to get actual UTF-8 byte length (what we see in chrome dev tools)
const bytes = new TextEncoder().encode(str).length;
const kb = bytes / 1024;
return Number(kb.toFixed(2));
} catch (error) {
throw new Error(
`Failed to calculate object size: ${error instanceof Error ? error.message : 'Unknown error'}`,
);
}
};
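This merge also adds `AI_ASSISTANT_MAX_CONTENT_LENGTH = 100` (kilobytes) and an `aiAssistant.payloadTooBig.message` string, so the most plausible consumer is a size guard on assistant payloads; the wiring below is an assumption for illustration, not the actual call site:

import { getObjectSizeInKB } from '@/utils/objectUtils';

// Hypothetical guard; the constant mirrors AI_ASSISTANT_MAX_CONTENT_LENGTH from constants.ts.
const MAX_PAYLOAD_SIZE_KB = 100;

function isPayloadTooBig(payload: unknown): boolean {
  return getObjectSizeInKB(payload) > MAX_PAYLOAD_SIZE_KB;
}

getObjectSizeInKB({ name: 'test' }); // 0.01 — matches the unit test above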

View file

@ -1,6 +1,6 @@
{
"name": "n8n-node-dev",
"version": "1.75.0",
"version": "1.76.0",
"description": "CLI to simplify n8n credentials/node development",
"main": "dist/src/index",
"types": "dist/src/index.d.ts",

View file

@ -138,6 +138,7 @@ export type AwsCredentialsType = {
sesEndpoint?: string;
sqsEndpoint?: string;
s3Endpoint?: string;
ssmEndpoint?: string;
};
// Some AWS services are global and don't have a region
@ -294,6 +295,19 @@ export class Aws implements ICredentialType {
default: '',
placeholder: 'https://s3.{region}.amazonaws.com',
},
{
displayName: 'SSM Endpoint',
name: 'ssmEndpoint',
description: 'Endpoint for AWS Systems Manager (SSM)',
type: 'string',
displayOptions: {
show: {
customEndpoints: [true],
},
},
default: '',
placeholder: 'https://ssm.{region}.amazonaws.com',
},
];
async authenticate(
@ -356,6 +370,8 @@ export class Aws implements ICredentialType {
endpointString = credentials.sqsEndpoint;
} else if (service === 'ssm' && credentials.ssmEndpoint) {
endpointString = credentials.ssmEndpoint;
} else if (service) {
endpointString = `https://${service}.${region}.amazonaws.com`;
}
endpoint = new URL(endpointString!.replace('{region}', region) + path);
} else {

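For the new SSM option the resolution works like the other custom endpoints: if `ssmEndpoint` is set it wins, otherwise the generic `https://{service}.{region}.amazonaws.com` pattern is used, and `{region}` is substituted just before the URL is built. A sketch with placeholder values only ('/' stands in for the request path):

// Illustrative endpoint resolution for service = 'ssm'.
const region = 'eu-central-1';
const ssmEndpoint: string | undefined = 'https://ssm.{region}.amazonaws.com';

const endpointString = ssmEndpoint ?? `https://ssm.${region}.amazonaws.com`;
const endpoint = new URL(endpointString.replace('{region}', region) + '/');
// → https://ssm.eu-central-1.amazonaws.com/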
View file

@ -9,7 +9,7 @@ export class JenkinsApi implements ICredentialType {
properties: INodeProperties[] = [
{
displayName: 'Jenking Username',
displayName: 'Jenkins Username',
name: 'username',
type: 'string',
default: '',

View file

@ -0,0 +1,47 @@
import type {
IAuthenticateGeneric,
ICredentialTestRequest,
ICredentialType,
INodeProperties,
} from 'n8n-workflow';
export class JiraSoftwareServerPatApi implements ICredentialType {
name = 'jiraSoftwareServerPatApi';
displayName = 'Jira SW Server (PAT) API';
documentationUrl = 'jira';
properties: INodeProperties[] = [
{
displayName: 'Personal Access Token',
name: 'personalAccessToken',
typeOptions: { password: true },
type: 'string',
default: '',
},
{
displayName: 'Domain',
name: 'domain',
type: 'string',
default: '',
placeholder: 'https://example.com',
},
];
authenticate: IAuthenticateGeneric = {
type: 'generic',
properties: {
headers: {
Authorization: '=Bearer {{$credentials.personalAccessToken}}',
},
},
};
test: ICredentialTestRequest = {
request: {
baseURL: '={{$credentials?.domain}}',
url: '/rest/api/2/project',
},
};
}
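Because the credential relies on the generic authentication helper, every Jira request issued with it simply gets a bearer header, and the built-in credential test hits the project list under the configured domain. With placeholder values:

// Illustrative request produced by this credential (values are placeholders).
const domain = 'https://jira.example.com';
const personalAccessToken = '<personal-access-token>';

const testRequest = {
  method: 'GET',
  url: `${domain}/rest/api/2/project`,
  headers: { Authorization: `Bearer ${personalAccessToken}` },
};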

View file

@ -0,0 +1,61 @@
import type { Icon, ICredentialType, INodeProperties } from 'n8n-workflow';
export class MiroOAuth2Api implements ICredentialType {
name = 'miroOAuth2Api';
extends = ['oAuth2Api'];
displayName = 'Miro OAuth2 API';
documentationUrl = 'miro';
icon: Icon = 'file:icons/Miro.svg';
httpRequestNode = {
name: 'Miro',
docsUrl: 'https://developers.miro.com/reference/overview',
apiBaseUrl: 'https://api.miro.com/v2/',
};
properties: INodeProperties[] = [
{
displayName: 'Grant Type',
name: 'grantType',
type: 'hidden',
default: 'authorizationCode',
},
{
displayName: 'Authorization URL',
name: 'authUrl',
type: 'hidden',
default: 'https://miro.com/oauth/authorize',
required: true,
},
{
displayName: 'Access Token URL',
name: 'accessTokenUrl',
type: 'hidden',
default: 'https://api.miro.com/v1/oauth/token',
required: true,
},
{
displayName: 'Scope',
name: 'scope',
type: 'hidden',
default: '',
required: true,
},
{
displayName: 'Auth URI Query Parameters',
name: 'authQueryParameters',
type: 'hidden',
default: '',
},
{
displayName: 'Authentication',
name: 'authentication',
type: 'hidden',
default: 'body',
},
];
}
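Every property here is hidden, so a user only supplies the client ID and secret inherited from the extended `oAuth2Api` type; the `httpRequestNode` block exposes the credential in the HTTP Request node with `https://api.miro.com/v2/` as the documented base URL. A hypothetical call once the OAuth2 flow has completed (token handling itself is done by the credential; the endpoint and token are placeholders):

const accessToken = '<oauth2-access-token>';
const response = await fetch('https://api.miro.com/v2/boards', {
  headers: { Authorization: `Bearer ${accessToken}` },
});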

View file

@ -0,0 +1,22 @@
<svg version="1.1" id="Layer_1" xmlns:x="ns_extend;" xmlns:i="ns_ai;" xmlns:graph="ns_graphs;" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 48 48" style="enable-background:new 0 0 48 48;" xml:space="preserve">
<style type="text/css">
.st0{fill:#FFD02F;}
.st1{fill-rule:evenodd;clip-rule:evenodd;fill:#050038;}
</style>
<metadata>
<sfw xmlns="ns_sfw;">
<slices>
</slices>
<sliceSourceBounds bottomLeftOrigin="true" height="48" width="48" x="175.8" y="-224.2">
</sliceSourceBounds>
</sfw>
</metadata>
<g>
<path class="st0" d="M10.4,0h27.2C43.3,0,48,4.7,48,10.4v27.2C48,43.3,43.3,48,37.6,48H10.4C4.7,48,0,43.3,0,37.6V10.4
C0,4.7,4.7,0,10.4,0z">
</path>
<path class="st1" d="M33.3,6h-5.3l4.4,7.7L22.8,6h-5.3l4.8,9.4L12.3,6H7l5.3,12L7,42h5.3l10.1-25.7L17.5,42h5.3l9.7-27.4L28.1,42
h5.3L43,12L33.3,6z">
</path>
</g>
</svg>

(New image file: Miro icon SVG, 915 B)

View file

@ -2,6 +2,12 @@ import type { INodeProperties } from 'n8n-workflow';
import { appendAttributionOption } from '../../utils/descriptions';
export const placeholder: string = `
<!-- Your custom HTML here --->
`.trimStart();
export const webhookPath: INodeProperties = {
displayName: 'Form Path',
name: 'path',
@ -36,9 +42,9 @@ export const formDescription: INodeProperties = {
};
export const formFields: INodeProperties = {
displayName: 'Form Fields',
displayName: 'Form Elements',
name: 'formFields',
placeholder: 'Add Form Field',
placeholder: 'Add Form Element',
type: 'fixedCollection',
default: { values: [{ label: '', fieldType: 'text' }] },
typeOptions: {
@ -60,12 +66,16 @@ export const formFields: INodeProperties = {
required: true,
},
{
displayName: 'Field Type',
displayName: 'Element Type',
name: 'fieldType',
type: 'options',
default: 'text',
description: 'The type of field to add to the form',
options: [
{
name: 'Custom HTML',
value: 'html',
},
{
name: 'Date',
value: 'date',
@ -109,7 +119,7 @@ export const formFields: INodeProperties = {
default: '',
displayOptions: {
hide: {
fieldType: ['dropdown', 'date', 'file'],
fieldType: ['dropdown', 'date', 'file', 'html'],
},
},
},
@ -158,6 +168,21 @@ export const formFields: INodeProperties = {
},
},
},
{
displayName: 'HTML Template',
name: 'html',
typeOptions: {
editor: 'htmlEditor',
},
type: 'string',
default: placeholder,
description: 'HTML template to render',
displayOptions: {
show: {
fieldType: ['html'],
},
},
},
{
displayName: 'Multiple Files',
name: 'multipleFiles',
@ -190,6 +215,23 @@ export const formFields: INodeProperties = {
name: 'formatDate',
type: 'notice',
default: '',
displayOptions: {
show: {
fieldType: ['date'],
},
},
},
{
displayName:
'Does not accept <code>&lt;style&gt;</code> <code>&lt;script&gt;</code> or <code>&lt;input&gt;</code> tags.',
name: 'htmlTips',
type: 'notice',
default: '',
displayOptions: {
show: {
fieldType: ['html'],
},
},
},
{
displayName: 'Required Field',
@ -198,6 +240,11 @@ export const formFields: INodeProperties = {
default: false,
description:
'Whether to require the user to enter a value for this field before submitting the form',
displayOptions: {
hide: {
fieldType: ['html'],
},
},
},
],
},
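Put together, a `formFields.values` array mixing a regular field with the new Custom HTML element could look like the sketch below (values are illustrative); the `html` entry is run through the sanitizer shown further down, so `<style>`, `<script>`, and `<input>` never reach the rendered form.

// Illustrative form definition using the new 'html' element type.
const formFields = [
  { fieldLabel: 'Name', fieldType: 'text', requiredField: true },
  {
    fieldLabel: 'Custom HTML',
    fieldType: 'html',
    html: '<div><b>Welcome!</b> See our <a href="https://example.com/terms">terms</a>.</div>',
  },
];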

View file

@ -15,6 +15,7 @@ import {
prepareFormReturnItem,
resolveRawData,
isFormConnected,
sanitizeHtml,
} from '../utils';
describe('FormTrigger, parseFormDescription', () => {
@ -42,6 +43,29 @@ describe('FormTrigger, parseFormDescription', () => {
});
});
describe('FormTrigger, sanitizeHtml', () => {
it('should remove forbidden HTML tags', () => {
const givenHtml = [
{
html: '<script>alert("hello world")</script>',
expected: '',
},
{
html: '<style>body { color: red; }</style>',
expected: '',
},
{
html: '<input type="text" value="test">',
expected: '',
},
];
givenHtml.forEach(({ html, expected }) => {
expect(sanitizeHtml(html)).toBe(expected);
});
});
});
describe('FormTrigger, formWebhook', () => {
const executeFunctions = mock<IWebhookFunctions>();
executeFunctions.getNode.mockReturnValue({ typeVersion: 2.1 } as any);
@ -80,6 +104,12 @@ describe('FormTrigger, formWebhook', () => {
acceptFileTypes: '.pdf,.doc',
multipleFiles: false,
},
{
fieldLabel: 'Custom HTML',
fieldType: 'html',
html: '<div>Test HTML</div>',
requiredField: false,
},
];
executeFunctions.getNodeParameter.calledWith('formFields.values').mockReturnValue(formFields);
@ -134,6 +164,16 @@ describe('FormTrigger, formWebhook', () => {
multipleFiles: '',
placeholder: undefined,
},
{
id: 'field-4',
errorId: 'error-field-4',
label: 'Custom HTML',
inputRequired: '',
defaultValue: '',
placeholder: undefined,
html: '<div>Test HTML</div>',
isHtml: true,
},
],
formSubmittedText: 'Your response has been recorded',
formTitle: 'Test Form',

View file

@ -24,11 +24,16 @@ import { getResolvables } from '../../utils/utilities';
import { WebhookAuthorizationError } from '../Webhook/error';
import { validateWebhookAuthentication } from '../Webhook/utils';
function sanitizeHtml(text: string) {
export function sanitizeHtml(text: string) {
return sanitize(text, {
allowedTags: [
'b',
'div',
'i',
'iframe',
'img',
'video',
'source',
'em',
'strong',
'a',
@ -48,8 +53,18 @@ function sanitizeHtml(text: string) {
],
allowedAttributes: {
a: ['href', 'target', 'rel'],
img: ['src', 'alt', 'width', 'height'],
video: ['*'],
iframe: ['*'],
source: ['*'],
},
transformTags: {
iframe: sanitize.simpleTransform('iframe', {
sandbox: '',
referrerpolicy: 'strict-origin-when-cross-origin',
allow: 'fullscreen; autoplay; encrypted-media',
}),
},
nonBooleanAttributes: ['*'],
});
}
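In effect, embeddable media tags now pass through with their attributes, every `<iframe>` is rewritten into a sandboxed one, and anything outside the allow-list is stripped. A hedged before/after sketch (exact attribute order in the output may differ):

sanitizeHtml('<script>alert(1)</script><img src="/logo.png" alt="logo">');
// → the <script> is dropped, the <img> survives with its src/alt attributes

sanitizeHtml('<iframe src="https://example.com/embed"></iframe>');
// → the <iframe> is kept but gains sandbox="" and
//   referrerpolicy="strict-origin-when-cross-origin" via the transform above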
@ -149,6 +164,9 @@ export function prepareFormData({
input.selectOptions = fieldOptions.map((e) => e.option);
} else if (fieldType === 'textarea') {
input.isTextarea = true;
} else if (fieldType === 'html') {
input.isHtml = true;
input.html = field.html as string;
} else {
input.isInput = true;
input.type = fieldType as 'text' | 'number' | 'date' | 'email';
@ -409,7 +427,14 @@ export async function formWebhook(
}
const mode = context.getMode() === 'manual' ? 'test' : 'production';
const formFields = context.getNodeParameter('formFields.values', []) as FormFieldsParameter;
const formFields = (context.getNodeParameter('formFields.values', []) as FormFieldsParameter).map(
(field) => {
if (field.fieldType === 'html') {
field.html = sanitizeHtml(field.html as string);
}
return field;
},
);
const method = context.getRequestObject().method;
checkResponseModeConfiguration(context);

View file

@ -28,6 +28,9 @@ export async function jiraSoftwareCloudApiRequest(
if (jiraVersion === 'server') {
domain = (await this.getCredentials('jiraSoftwareServerApi')).domain as string;
credentialType = 'jiraSoftwareServerApi';
} else if (jiraVersion === 'serverPat') {
domain = (await this.getCredentials('jiraSoftwareServerPatApi')).domain as string;
credentialType = 'jiraSoftwareServerPatApi';
} else {
domain = (await this.getCredentials('jiraSoftwareCloudApi')).domain as string;
credentialType = 'jiraSoftwareCloudApi';
@ -233,7 +236,7 @@ export async function getUsers(this: ILoadOptionsFunctions): Promise<INodeProper
const query: IDataObject = { maxResults };
let endpoint = '/api/2/users/search';
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
endpoint = '/api/2/user/search';
query.username = "'";
}

View file

@ -67,6 +67,15 @@ export class Jira implements INodeType {
},
},
},
{
name: 'jiraSoftwareServerPatApi',
required: true,
displayOptions: {
show: {
jiraVersion: ['serverPat'],
},
},
},
],
properties: [
{
@ -82,6 +91,10 @@ export class Jira implements INodeType {
name: 'Server (Self Hosted)',
value: 'server',
},
{
name: 'Server (PAT) (Self Hosted)',
value: 'serverPat',
},
],
default: 'cloud',
},
@ -139,7 +152,7 @@ export class Jira implements INodeType {
let endpoint = '';
let projects;
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
endpoint = '/api/2/project';
projects = await jiraSoftwareCloudApiRequest.call(this, endpoint, 'GET');
} else {
@ -276,8 +289,12 @@ export class Jira implements INodeType {
async getCustomFields(this: ILoadOptionsFunctions): Promise<INodeListSearchResult> {
const returnData: INodeListSearchItems[] = [];
const operation = this.getCurrentNodeParameter('operation') as string;
const jiraVersion = this.getNodeParameter('jiraVersion', 0) as string;
let projectId: string;
let issueTypeId: string;
let issueId: string = ''; // /editmeta endpoint requires issueId
if (operation === 'create') {
projectId = this.getCurrentNodeParameter('project', { extractValue: true }) as string;
issueTypeId = this.getCurrentNodeParameter('issueType', { extractValue: true }) as string;
@ -292,6 +309,26 @@ export class Jira implements INodeType {
);
projectId = res.fields.project.id;
issueTypeId = res.fields.issuetype.id;
issueId = res.id;
}
if (jiraVersion === 'server' && operation === 'update' && issueId) {
// https://developer.atlassian.com/server/jira/platform/jira-rest-api-example-edit-issues-6291632/
const { fields } = await jiraSoftwareCloudApiRequest.call(
this,
`/api/2/issue/${issueId}/editmeta`,
'GET',
);
for (const field of Object.keys(fields || {})) {
if (field.startsWith('customfield_')) {
returnData.push({
name: fields[field].name,
value: field,
});
}
}
return { results: returnData };
}
const res = await jiraSoftwareCloudApiRequest.call(
@ -478,7 +515,7 @@ export class Jira implements INodeType {
};
}
if (additionalFields.assignee) {
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
fields.assignee = {
name: additionalFields.assignee as string,
};
@ -489,7 +526,7 @@ export class Jira implements INodeType {
}
}
if (additionalFields.reporter) {
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
fields.reporter = {
name: additionalFields.reporter as string,
};
@ -608,7 +645,7 @@ export class Jira implements INodeType {
};
}
if (updateFields.assignee) {
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
fields.assignee = {
name: updateFields.assignee as string,
};
@ -619,7 +656,7 @@ export class Jira implements INodeType {
}
}
if (updateFields.reporter) {
if (jiraVersion === 'server') {
if (jiraVersion === 'server' || jiraVersion === 'serverPat') {
fields.reporter = {
name: updateFields.reporter as string,
};
@ -1001,7 +1038,8 @@ export class Jira implements INodeType {
}
}
if (resource === 'issueAttachment') {
const apiVersion = jiraVersion === 'server' ? '2' : ('3' as string);
const apiVersion =
jiraVersion === 'server' || jiraVersion === 'serverPat' ? '2' : ('3' as string);
//https://developer.atlassian.com/cloud/jira/platform/rest/v3/api-group-issue-attachments/#api-rest-api-3-issue-issueidorkey-attachments-post
if (operation === 'add') {
@ -1159,7 +1197,8 @@ export class Jira implements INodeType {
}
if (resource === 'issueComment') {
let apiVersion = jiraVersion === 'server' ? '2' : ('3' as string);
let apiVersion =
jiraVersion === 'server' || jiraVersion === 'serverPat' ? '2' : ('3' as string);
//https://developer.atlassian.com/cloud/jira/platform/rest/v3/api-group-issue-comments/#api-rest-api-3-issue-issueidorkey-comment-post
if (operation === 'add') {
@ -1181,7 +1220,7 @@ export class Jira implements INodeType {
Object.assign(body, options);
if (!jsonParameters) {
const comment = this.getNodeParameter('comment', i) as string;
if (jiraVersion === 'server' || options.wikiMarkup) {
if (jiraVersion === 'server' || jiraVersion === 'serverPat' || options.wikiMarkup) {
Object.assign(body, { body: comment });
} else {
Object.assign(body, {
@ -1332,7 +1371,7 @@ export class Jira implements INodeType {
Object.assign(qs, options);
if (!jsonParameters) {
const comment = this.getNodeParameter('comment', i) as string;
if (jiraVersion === 'server' || options.wikiMarkup) {
if (jiraVersion === 'server' || jiraVersion === 'serverPat' || options.wikiMarkup) {
Object.assign(body, { body: comment });
} else {
Object.assign(body, {
@ -1383,7 +1422,8 @@ export class Jira implements INodeType {
}
if (resource === 'user') {
const apiVersion = jiraVersion === 'server' ? '2' : ('3' as string);
const apiVersion =
jiraVersion === 'server' || jiraVersion === 'serverPat' ? '2' : ('3' as string);
if (operation === 'create') {
// https://developer.atlassian.com/cloud/jira/platform/rest/v3/api-group-users/#api-rest-api-3-user-post

View file

@ -45,6 +45,16 @@ export class JiraTrigger implements INodeType {
},
},
},
{
displayName: 'Credentials to Connect to Jira',
name: 'jiraSoftwareServerPatApi',
required: true,
displayOptions: {
show: {
jiraVersion: ['serverPat'],
},
},
},
{
// eslint-disable-next-line n8n-nodes-base/node-class-description-credentials-name-unsuffixed
name: 'httpQueryAuth',
@ -87,6 +97,10 @@ export class JiraTrigger implements INodeType {
name: 'Server (Self Hosted)',
value: 'server',
},
{
name: 'Server (PAT) (Self Hosted)',
value: 'serverPat',
},
],
default: 'cloud',
},

View file

@ -0,0 +1,126 @@
import type { MockProxy } from 'jest-mock-extended';
import { mock } from 'jest-mock-extended';
import type { IHttpRequestMethods, ILoadOptionsFunctions } from 'n8n-workflow';
import { Jira } from '../Jira.node';
const ISSUE_KEY = 'KEY-1';
jest.mock('../GenericFunctions', () => {
const originalModule = jest.requireActual('../GenericFunctions');
return {
...originalModule,
jiraSoftwareCloudApiRequest: jest.fn(async function (
endpoint: string,
method: IHttpRequestMethods,
) {
if (method === 'GET' && endpoint === `/api/2/issue/${ISSUE_KEY}`) {
return {
id: 10000,
fields: {
project: {
id: 10001,
},
issuetype: {
id: 10002,
},
},
};
} else if (method === 'GET' && endpoint === '/api/2/issue/10000/editmeta') {
return {
fields: {
customfield_123: {
name: 'Field 123',
},
customfield_456: {
name: 'Field 456',
},
},
};
} else if (
method === 'GET' &&
endpoint ===
'/api/2/issue/createmeta?projectIds=10001&issueTypeIds=10002&expand=projects.issuetypes.fields'
) {
return {
projects: [
{
id: 10001,
issuetypes: [
{
id: 10002,
fields: {
customfield_abc: {
name: 'Field ABC',
schema: { customId: 'customfield_abc' },
fieldId: 'customfield_abc',
},
customfield_def: {
name: 'Field DEF',
schema: { customId: 'customfield_def' },
fieldId: 'customfield_def',
},
},
},
],
},
],
};
}
}),
};
});
describe('Jira Node, methods', () => {
let jira: Jira;
let loadOptionsFunctions: MockProxy<ILoadOptionsFunctions>;
beforeEach(() => {
jira = new Jira();
loadOptionsFunctions = mock<ILoadOptionsFunctions>();
});
describe('listSearch.getCustomFields', () => {
it('should call correct endpoint and return custom fields for server version', async () => {
loadOptionsFunctions.getCurrentNodeParameter.mockReturnValueOnce('update');
loadOptionsFunctions.getNodeParameter.mockReturnValue('server');
loadOptionsFunctions.getCurrentNodeParameter.mockReturnValueOnce(ISSUE_KEY);
const { results } = await jira.methods.listSearch.getCustomFields.call(
loadOptionsFunctions as ILoadOptionsFunctions,
);
expect(results).toEqual([
{
name: 'Field 123',
value: 'customfield_123',
},
{
name: 'Field 456',
value: 'customfield_456',
},
]);
});
it('should call correct endpoint and return custom fields for cloud version', async () => {
loadOptionsFunctions.getCurrentNodeParameter.mockReturnValueOnce('update');
loadOptionsFunctions.getNodeParameter.mockReturnValue('cloud');
loadOptionsFunctions.getCurrentNodeParameter.mockReturnValueOnce(ISSUE_KEY);
const { results } = await jira.methods.listSearch.getCustomFields.call(
loadOptionsFunctions as ILoadOptionsFunctions,
);
expect(results).toEqual([
{
name: 'Field ABC',
value: 'customfield_abc',
},
{
name: 'Field DEF',
value: 'customfield_def',
},
]);
});
});
});

View file

@ -1,6 +1,6 @@
{
"name": "n8n-nodes-base",
"version": "1.75.0",
"version": "1.76.0",
"description": "Base nodes of n8n",
"main": "index.js",
"scripts": {
@ -181,6 +181,7 @@
"dist/credentials/JenkinsApi.credentials.js",
"dist/credentials/JiraSoftwareCloudApi.credentials.js",
"dist/credentials/JiraSoftwareServerApi.credentials.js",
"dist/credentials/JiraSoftwareServerPatApi.credentials.js",
"dist/credentials/JotFormApi.credentials.js",
"dist/credentials/JwtAuth.credentials.js",
"dist/credentials/Kafka.credentials.js",
@ -231,6 +232,7 @@
"dist/credentials/MicrosoftToDoOAuth2Api.credentials.js",
"dist/credentials/MindeeInvoiceApi.credentials.js",
"dist/credentials/MindeeReceiptApi.credentials.js",
"dist/credentials/MiroOAuth2Api.credentials.js",
"dist/credentials/MispApi.credentials.js",
"dist/credentials/MistApi.credentials.js",
"dist/credentials/MoceanApi.credentials.js",

View file

@ -1 +1,6 @@
import 'reflect-metadata';
// Disable task runners until we have fixed the "run test workflows" test
// to mock the Code Node execution
process.env.N8N_RUNNERS_ENABLED = 'false';
process.env.N8N_ENFORCE_SETTINGS_FILE_PERMISSIONS = 'false';

View file

@ -1,6 +1,6 @@
{
"name": "n8n-workflow",
"version": "1.74.0",
"version": "1.75.0",
"description": "Workflow base code of n8n",
"main": "dist/index.js",
"module": "src/index.ts",

View file

@ -36,6 +36,7 @@ export function augmentArray<T>(data: T[]): T[] {
return Reflect.deleteProperty(getData(), key);
},
get(target, key: string, receiver): unknown {
if (key === 'constructor') return Array;
const value = Reflect.get(newData ?? target, key, receiver) as unknown;
const newValue = augment(value);
if (newValue !== value) {
@ -83,6 +84,8 @@ export function augmentObject<T extends object>(data: T): T {
const proxy = new Proxy(data, {
get(target, key: string, receiver): unknown {
if (key === 'constructor') return Object;
if (deletedProperties.has(key)) {
return undefined;
}

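The guarantee the new trap provides (pinned down by the tests further below): reading `constructor` through an augmented proxy always returns the native constructor, and assigning or deleting it is a no-op. A minimal sketch, assuming the functions are imported from this module:

import { augmentArray, augmentObject } from './AugmentObject'; // import path illustrative

const augmented = augmentObject({ a: { b: 1 } });
console.log(augmented.constructor === Object);   // true — the get trap short-circuits
console.log(augmented.a.constructor === Object); // true for nested proxies as well
console.log(augmentArray([1, 2]).constructor === Array); // true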
View file

@ -2684,6 +2684,7 @@ export type FormFieldsParameter = Array<{
multipleFiles?: boolean;
acceptFileTypes?: string;
formatDate?: string;
html?: string;
placeholder?: string;
}>;

View file

@ -10,6 +10,8 @@ describe('AugmentObject', () => {
const augmentedObject = augmentArray(originalObject);
expect(augmentedObject.constructor.name).toEqual('Array');
expect(augmentedObject.push(5)).toEqual(6);
expect(augmentedObject).toEqual([1, 2, 3, 4, null, 5]);
expect(originalObject).toEqual(copyOriginal);
@ -207,6 +209,8 @@ describe('AugmentObject', () => {
const augmentedObject = augmentObject(originalObject);
expect(augmentedObject.constructor.name).toEqual('Object');
augmentedObject[1] = 911;
expect(originalObject[1]).toEqual(11);
expect(augmentedObject[1]).toEqual(911);
@ -589,5 +593,29 @@ describe('AugmentObject', () => {
delete augmentedObject.toString;
expect(augmentedObject.toString).toBeUndefined();
});
test('should handle constructor property correctly', () => {
const originalObject: any = {
a: {
b: {
c: {
d: '4',
},
},
},
};
const augmentedObject = augmentObject(originalObject);
expect(augmentedObject.constructor.name).toEqual('Object');
expect(augmentedObject.a.constructor.name).toEqual('Object');
expect(augmentedObject.a.b.constructor.name).toEqual('Object');
expect(augmentedObject.a.b.c.constructor.name).toEqual('Object');
augmentedObject.constructor = {};
expect(augmentedObject.constructor.name).toEqual('Object');
delete augmentedObject.constructor;
expect(augmentedObject.constructor.name).toEqual('Object');
});
});
});