7.1 - WHAT? HOW? WHY? 🧐
First things first: what even is a File Assessment? Back in Chapter 2 we introduced the Visual Assessment: a quick and easy way to review rendered images of 3D garments. We introduced some of the basics you should be looking out for when you review a render.
But later in the season, we recommend opening the actual 3D files as well, before you accept or reject a file from a vendor! You want to make sure the file is set up correctly, because many different teams will need to work with it: from design all the way to the asset manager who'll eventually upload the final renders for the digital showroom!
So it's important to check 3D files to ensure a smooth workflow for all teams across your division. But that's not everything, because the file assessment serves a second purpose as well: upskilling! It's important for your vendors to improve their 3D skills:
Do you remember the first time you opened the 3D software? 😨
Did you feel overwhelmed by this new skill you'd have to dive into? 🤯
We certainly did! 🙋♀️
And the exact same thing applies to your vendors: this 3D process is just as new to them as it is to PVH. So providing them with useful feedback on where they can improve is essential for a smooth 3D process. Vendors improving their skills is beneficial to all of us: yay for higher quality in the future! Win-win!
REVIEWING SAMPLES 🧐
Upon arrival of a physical sample from a vendor, multiple aspects of that garment are checked by different teams across the division. While the pattern & fit teams pull out their measuring tapes to check the sizing of the garment, the PD team typically checks whether the sample follows the BOM correctly.
Every team has their own responsibilities when it comes to sample reviewing,
... so that's no different for samples delivered in 3D!
So after completing this chapter, with all the important things to look out for when reviewing a digital sample, we recommend that every division get together and define roles & responsibilities. Think of topics like:
🙆 Which team is responsible for each step of the reviewing process?
🙋 Which criteria are vital to check for the 3D samples of your division?
🤷 Do you review all of the samples in-depth with a full file assessment?
🙅 Or only when the visual assessment indicates major issues in the 3D file?
It could be that some review criteria are essential for one
product group, but useless for another type of product!
So in the next subchapters, we'll introduce the assessment form, which is basically a long list of all the different aspects we recommend checking for 3D files. All those small things that matter for a smooth workflow: from a purely visual perspective to a "ready-for-automation" point of view, allowing the 3D file to be used from end to end.
This chapter will definitely be a very deep dive into the software - compared to all other chapters in this training. But everything will still be from the perspective of reviewing a 3D file, rather than creating a 3D file!
↳ Looking for the latter? Head over to our basic trainings for VStitcher instead! 😎
7.2 - THE ASSESSMENT FORM 📋
Just like the Request form introduced in Chapter 4, we also created a template for your file assessments. You can find the template in the same location, on the Vendor Hub. And just like we mentioned for the request form: we strongly recommend using it, but in the end it's all up to you! As long as you have some kind of guide or checklist for reviewing files, you'll be good to go. So treat it like an example, not a given ;-)
Three categories... 📊
After listing out all of the different criteria, most of them could easily be divided into three main categories. Some criteria overlapped slightly between categories, but in the end there was always one winner! These are the three categories:
The 30% - 40% - 30% distribution refers to the weight of each category in the final score calculation of the vendor. In the end all of this is about having those visually appealing renders of the garment, ideally all the way until the digital showroom, right? That's why the Visual category received the almighty extra 10%, more than General & Technical! 🌟
But what does each category stand for? 🤔
GENERAL
⇨ Naming standards
⇨ File organisation
⇨ Follows info provided
Criteria essential for standardisation and automation practices further down the line!

VISUAL
⇨ Colors & Materials
⇨ Simulation Settings
⇨ Simulation Results
What you see is what you get! Criteria that all significantly affect the visual outcome!

TECHNICAL
⇨ Patterns & 2D set up
⇨ Arrange & Prepare
⇨ Stitching & Materials
Criteria around best-practices & technical details for convenience in terms of file sharing!
Having said that, it's time to introduce the actual form itself: the template we've been talking about for six chapters in a row now! Download the Excel file from the Vendor Hub to get access to the interactive features of the form!
Wait, say that again? INTERACTIVE FEATURES? 🤩
Yes, you heard that correctly! As mentioned briefly before: the Excel sheet is set up with all the necessary formulas to calculate the following:
The total points for each category
The total points of all categories together
The total score in % (taking into account the 30-40-30 distribution)
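To make the 30-40-30 weighting concrete, here's a minimal sketch in Python of how such a total could be computed. The category points in the example are invented for illustration; the actual form does this with Excel formulas.

```python
# The 30-40-30 weights mirror the General / Visual / Technical split,
# with Visual carrying the extra 10%.
WEIGHTS = {"General": 0.30, "Visual": 0.40, "Technical": 0.30}

def total_score(scores):
    """scores maps category -> (points earned, points possible).
    Returns the weighted total as a percentage."""
    total = 0.0
    for category, weight in WEIGHTS.items():
        earned, possible = scores[category]
        total += weight * (earned / possible)
    return round(total * 100, 1)

# Hypothetical assessment: one General criterion missed.
example = {"General": (6, 7), "Visual": (7, 7), "Technical": (9, 9)}
print(total_score(example))  # → 95.7
```

Note how a missed criterion in the Visual category would cost more than one in General or Technical, exactly because of its 40% weight.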
But that's not all - there is more...
In the fictional assessment form above, the vendor has done an outstanding job: the file meets all of the required criteria! Without a doubt, this file deserves the full 100%! But the vendor didn't need to create any colorways - that wasn't even part of the request - so what do you do with the criteria concerning colorways? 😩
↳ Just fill in "N.A." and wait for the magic to happen! ✨
What happens? Excel will treat that specific criterion as "Not Applicable", meaning that:
The criterion is crossed out by a strike-through font format
The criterion is greyed out, to emphasise that it's "not important" (for this file)
The total score in % is re-calculated, to ensure a 100% score is still achievable!
It doesn't matter if you use capital letters or not: just make sure you
include a full stop (.) after each letter! So everything works, from
"N.A." to "n.a." or anything in between, but "NA" or "na" won't work.
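The "N.A." behaviour could be sketched like this (a hypothetical Python illustration; the actual form implements it with Excel formulas, and the criterion names below are invented):

```python
def is_not_applicable(value):
    """Matches "N.A.", "n.a.", "N.a." etc. - dots required, case ignored."""
    return value.strip().lower() == "n.a."

def score_percentage(criteria):
    """criteria maps name -> 1 (met), 0 (not met) or an "N.A." string.
    N.A. rows are excluded, so a full 100% stays achievable."""
    applicable = [v for v in criteria.values()
                  if not (isinstance(v, str) and is_not_applicable(v))]
    return round(100 * sum(applicable) / len(applicable), 1)

form = {"Naming standards": 1, "Colorway naming": "n.a.", "Pattern naming": 1}
print(score_percentage(form))  # → 100.0, the N.A. row is excluded
```

The key point is that the denominator shrinks along with the numerator: "N.A." removes the criterion from the calculation entirely instead of counting it as a zero.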
The last important element we'd like to highlight is the second tab of the form (at the bottom of the Excel page). This tab, "Additional Screenshots", as the name suggests, includes a basic empty template for adding extra screenshots to the list of criteria. Sometimes issues are just easier to explain alongside some visuals than in text only!
7.3 - THE ASSESSMENT CRITERIA ✅
In this subchapter, we'll introduce the list of criteria that we created. These criteria form the foundation of the assessment form. They're short, concise and simplified in the form itself, but most of them cover many different issues and problems at the same time.
For other criteria you might just be wondering why they deserved a spot in the list: what makes these criteria so important to be worth a full point in the scoring form...?
We will explain each criterion thoroughly, based on the following questions:
WHAT DOES IT MEAN? 🤷
↳ A brief explanation of what it means and what's included in this criterion
WHY IS THIS IMPORTANT? ⚠️
↳ The importance explained by "what's affected when this criterion is not met"
HOW TO RECOGNISE IT? 🧐
↳ What's correct & what's incorrect? Different examples to recognise the issues
IS THERE AN EASY FIX? 🛠️
↳ If there's an easy fix - that can make you the "3D HERO OF THE YEAR" because you fixed the file on time for sketch review - we'll include it here!
"But wait, shouldn't the vendors fix their own files?"
"Why would I correct their mistakes?" 🤔
Of course it's not your job to correct a vendor's mistakes - they should fix them themselves, for sure! But sometimes, it's just a matter of 1 or 2 clicks in the software. So if you know which buttons to click, you can save yourself a few days: no need to wait until the vendor clicks those exact few buttons! Especially when there's an important milestone coming up, and there's no time left to wait for an updated file from the vendor.
BUT! Be careful here!
Even though it is sometimes easy to fix a small mistake, don't forget to communicate the mistake to your vendor! They did not meet our standards and requirements, so no points for that criterion in the assessment form. Make sure to keep communicating every small mistake, so your vendors can keep improving.
Then at the end of each criterion's explanation, we've included some links & referrals:
LEARN MORE 🎓
with STITCH Academy
If the topic is also covered in one of our basic trainings, a link will be included to the corresponding chapter (or to our page with "single topic" how-to videos!)
READ MORE 📚
in the 3D Quality Standards
We will include the chapter of the 3D Apparel Quality Standards with more information about the requirements and standards around this topic.
But wait! ✋
☝️ There's one more thing,
before we dive into the criteria...
It's the Instant Rejection ⛔
There are two main examples of situations where we recommend rejecting a file straight away, without even looking at any of the criteria from the list. It's when you get an error like one of these upon opening the .BW file:
↳ The .BW file is saved without the avatar
↳ It's created in a wrong version of VStitcher
In case of the first error, the only thing the vendor needs to do is save the file again, this time with the avatar included. The second error is a bigger issue, because the vendor will have to rebuild the file in the correct version of the software.
🏊♀️ So, let's finally dive deeper into all those criteria now! 🏊♀️
(Click on any criterion to read the full explanations)
Naming standards followed (for BW file and Stylezone upload)
Colorways are named correctly (PLM option code)
Correct pattern piece naming (in English)
File is cleaned up (colors in use, unused materials)
Colorways are consistent in simulation & previews refreshed
Naming & amount of snapshots is correct
All details follow techpack/BOM
Colors & Materials
Coloring done correctly (material color codes, blending modes)
Edge shadows applied & correct corner seam alignment in use
Grid standards followed accordingly
Shrink & Force Multiplier have been taken into account
No overuse of 3D styling tools (freeze/flatten)
Garment is balanced on the avatar
Overall simulation quality appropriate (for division and product group)
Arrange & Prepare
Measurements follow size specification
Patterns are complete and correct (e.g. grainlines, layering follows physical)
Symmetry applied everywhere possible
Patterns are logically arranged, don't overlap, and unused pieces are hidden
Correct clusters chosen for each pattern
Placement is correct (centered, overlap where applicable, no intersecting)
Fold lines, 3D layering and collision types set up correctly
All stitching done correctly (+ no stitching to hidden pieces)
Graphics and AOP applied technically correct & according to BOM/sketch
It's a long list, we agree! But keep in mind that you don't need to check every single criterion for each request. From here on, it's up to you to decide which criteria you consider most important for each of your requests.
The importance of a single criterion can be different depending on the division, the product group, but mostly on the request type: blocks, styles, or even including colorways.
The request type indicates the importance of a criterion: a block request usually
won't include any colorways, graphics or AOP - so you can simply enter "N.A."!
But for NOS blocks with logos that never change: make sure to check those logos!
And even though this list is awfully long, it still doesn't cover everything. The criteria cover the most commonly occurring and most alarming issues in 3D files. But in the end there's always more, unfortunately... So let's just wrap it up like this:
maybe it's not everything...
But it's still quite an all-encompassing guide! 😎
The last thing to consider, is the information you share with your vendor upon requesting. If you share an existing 3D file with them, this file already includes many different elements like pattern piece naming, symmetry and simulation quality. And if these things are not set up correctly, the vendor can't be held accountable for that.
So make sure you know what you're sending out with a request. If a block/fabric/trim on the hub doesn't meet your own standards, we recommend to check internally with the department responsible for the quality of these assets, before you send out the request!
7.4 - BEST PRACTICES 💡
We have a few more important steps and best practices for when you're doing a file assessment. Some do's and don'ts to ensure the integrity of the file remains untouched!
1. See the garment in 3D after opening the file:
↳ DON'T click prepare and/or dress to open the simulation
↳ DO select the saved snapshot to open the simulation
When you use prepare or dress, it will start simulating from scratch, which means
you lose the existing simulation from the vendor. Secondly, it will start simulating
on the avatar that was last used, not on the avatar saved in the snapshot. So if you
had a Men's style open, then close the file and open a Women's style afterwards:
using prepare/dress will simulate the Women's garment on the Men's avatar.
2. Review the arrangement in Prepare mode:
↳ DON'T click on dress after reviewing the prepare mode
↳ DO open the saved snapshot after reviewing the prepare mode
3. Share your feedback to the vendor:
↳ DON'T forget to mention how you reviewed it: visual or file assessment?
↳ DO mention clearly whether you opened the file or only reviewed renders
How you review a style - based on renders (Visual Assessment) or based
on the 3D file (File assessment) - is important to avoid confusion!
For instance: a file has some small mistakes, but these are not visible in the renders.
After a quick visual assessment, you accept the submission. If the vendor
doesn't know that you didn't open the file, they might assume that the mistakes
are 'forgiven'. Or even that they don't need to follow the standards related to
those mistakes anymore - because you accepted it, right? 🤷
It should be clear for each vendor why certain 'mistakes' are sometimes accepted,
and other times rejected - as it depends on whether you opened the file or not ;-)
7.5 - VENDOR RATINGS 🏆
As we already explained in subchapter 7.2, the assessment form gives you the total score as a percentage, visible at the top of the form. This score reflects the vendor's performance based on one single submission at a time. It doesn't give you a clear overview of the overall performance or capabilities of each vendor yet...
...because that ball is in your court again! ⛳
Based on the totals you receive for each category in the assessment form - General, Visual and Technical - you can create your own rating system to track the overall performance and 3D skills of each vendor. Going forward, you can use this information for future decisions about which types of requests you do, or don't, want to send to certain vendors.
But we don't think that's something we need to explain to you: that's your expertise! We assume that you already have your own rating system in place, where you keep track of the communication, sample quality, delivery times and any other important data about each vendor, right? So let's add 3D to this data, and track their performance that way! 💥
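How you track this is entirely up to you, but as a purely hypothetical sketch (vendor names and scores are invented), averaging the category totals per vendor across submissions could look like this:

```python
from collections import defaultdict

# Hypothetical assessment log: each entry is (vendor, category scores in %).
assessments = [
    ("Vendor A", {"General": 90, "Visual": 80, "Technical": 85}),
    ("Vendor A", {"General": 95, "Visual": 90, "Technical": 80}),
    ("Vendor B", {"General": 70, "Visual": 95, "Technical": 75}),
]

def vendor_ratings(log):
    """Average each vendor's category scores across all their submissions."""
    collected = defaultdict(lambda: defaultdict(list))
    for vendor, scores in log:
        for category, score in scores.items():
            collected[vendor][category].append(score)
    return {vendor: {cat: sum(vals) / len(vals) for cat, vals in cats.items()}
            for vendor, cats in collected.items()}

print(vendor_ratings(assessments)["Vendor A"]["General"])  # → 92.5
```

Keeping the three category averages separate (rather than one overall number) lets you see, for example, that a vendor is visually strong but weak on the technical set-up - useful when deciding which request types to send their way.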
IT'S YOUR TIME TO SHINE! ✨
You know pretty much e-ve-ry-thing now! Great work on making it this far! Let's put
all that knowledge into practice now, because we all know... Practice makes perfect 🌈