Inconsistent constraint order in DTO validation errors

I am new to NestJS. I created a new DTO for my controller and added it to a CustomValidationPipe that extends ValidationPipe and implements PipeTransform. The DTO is defined below:

import { IsNotEmpty, IsObject, IsString, MaxLength, MinLength, ValidateIf, ValidateNested } from 'class-validator';
import { Type } from 'class-transformer';

// Project constant; the values here are inferred from the error log further below.
const NAME = { MIN_LENGTH: 1, MAX_LENGTH: 255 };

export class _sample_object {
  something: string;
  optional_param_1?: string;
}

export class AddFrameworkDto {

  @MaxLength(Number(NAME.MAX_LENGTH), { message: `should not be more than ${NAME.MAX_LENGTH}` })
  @MinLength(Number(NAME.MIN_LENGTH), { message: `should not be less than ${NAME.MIN_LENGTH}` })
  @IsString({ message: 'is not valid' })
  @IsNotEmpty({ message: 'should not be empty' })
  name: string;

  @MinLength(Number(NAME.MIN_LENGTH), { message: `should not be less than ${NAME.MIN_LENGTH}` })
  @MaxLength(Number(NAME.MAX_LENGTH), { message: `should not be more than ${NAME.MAX_LENGTH}` })
  @IsString({ message: 'is not valid' })
  @ValidateIf((object, value) => value !== undefined)
  description?: string;

  @IsObject({ message: 'is not valid' })
  @IsNotEmpty({ message: 'should not be empty' })
  @ValidateNested({ each: true })
  @Type(() => _sample_object)
  sample_object: _sample_object;
}
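
For reference, a request body that should pass this DTO's validation looks something like this (the values are illustrative):

{
    "name": "framework-1",
    "description": "a short description",
    "sample_object": { "something": "value" }
}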

Now, once I make a request via the controller, my CustomValidationPipe is triggered, where I handle the messages as shown below:

import { ArgumentMetadata, BadRequestException, Injectable, PipeTransform, ValidationPipe } from '@nestjs/common';
import { plainToClass } from 'class-transformer';
import { validate } from 'class-validator';

@Injectable()
export class CustomValidationPipe extends ValidationPipe implements PipeTransform {
    // `module` and `key` are assigned automatically by the parameter properties.
    constructor(public module: string, public key: string) {
        super();
    }

    async transform(value: Record<string, unknown>, metaData: ArgumentMetadata): Promise<any> {
        const { metatype } = metaData;
        // Primitives and untyped params have no metatype to validate against.
        if (!metatype) {
            return value;
        }
        const object = plainToClass(metatype, value);
        const errors = await validate(object);
        const error_log = {
            error: {
                errors: [] as Record<string, string>[]
            }
        };
        if (errors.length > 0) {
            errors.forEach((error) => {
                const property_name = error.property;
                // Keep only the first reported constraint for each property.
                for (const [, message] of Object.entries(error.constraints ?? {})) {
                    error_log.error.errors.push({
                        [property_name]: message,
                    });
                    break;
                }
            });
            console.log(this.module + '.errors.' + this.key, '400', error_log);
            // Returning console.log() would pass `undefined` on to the handler,
            // so reject the request instead.
            throw new BadRequestException(error_log);
        }
        return value;
    }
}
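
For context, here is roughly how such a pipe is wired into a route; the controller and the 'frameworks'/'add' constructor arguments are illustrative, while @Controller, @Post, @Body and @UsePipes are standard @nestjs/common decorators:

import { Body, Controller, Post, UsePipes } from '@nestjs/common';
// AddFrameworkDto and CustomValidationPipe are the classes defined above.

@Controller('v1/frameworks')
export class FrameworksController {
    @Post()
    @UsePipes(new CustomValidationPipe('frameworks', 'add')) // illustrative module/key values
    addFramework(@Body() dto: AddFrameworkDto): AddFrameworkDto {
        return dto;
    }
}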

After this, I make a request to my module, as shown in the cURL below:

curl --location 'http://localhost:3000/v1/frameworks' \
--header 'Content-Type: application/json' \
--data '{
    "name": 1, 
    "description": "description", 
    "sample_object": 1
}'

The ValidationError that is caught is not reported in the expected order. In my case, the field name performs validation in the following order:

  • Is not empty
  • Is string
  • Minimum length
  • Maximum length

However, the field sample_object is checked in this order:

  • Is object
  • Is not empty
  • Validate nested
  • Type check

The error logs below confirm this:

[
  ValidationError {
    target: AddFrameworkDto {
      name: 1,
      description: 'description',
      sample_object: 1
    },
    value: 1,
    property: 'name',
    children: [],
    constraints: {
      isString: 'is not valid',
      minLength: 'should not be less than 1',
      maxLength: 'should not be more than 255'
    }
  },
  ValidationError {
    target: AddFrameworkDto {
      name: 1,
      description: 'description',
      sample_object: 1
    },
    value: 1,
    property: 'sample_object',
    children: [],
    constraints: {
      isObject: 'is not valid',
      nestedValidation: 'each value in nested property sample_object must be either object or array'
    }
  }
]
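
To make the "order" concrete: what is being compared is the key order of each error's constraints object, which can be printed right after validate() in the pipe (a minimal sketch using the errors array from above):

errors.forEach((error) => {
    // With the payload from the cURL above this prints:
    //   name [ 'isString', 'minLength', 'maxLength' ]
    //   sample_object [ 'isObject', 'nestedValidation' ]
    console.log(error.property, Object.keys(error.constraints ?? {}));
});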

Is there a way to account for this ordering?
