
iraj mohtasham
@irajsb
@JayChang-zhe If, for example, your cube has 3 materials, it will actually be 3 mesh "sections" (in real-time rendering it will also be 3 draw calls)
JayChang-zhe
@JayChang-zhe
@irajsb Thanks for your reply. One more question: on my side with the library, I have, for example, 3 cubes with different colors (green, red, blue). After I combine the three of them in Blender, when I try to load this FBX, with the green, blue and red materials inside, it only shows one cube, but if I remove the red material, it shows two cubes, the green and the red one. Why is this?
Benjamin Jillich
@amzn-jillich
Hi everyone, is there a way to access the bind pose transform for a node/bone that is not actually used by any skin influence? aiMesh holds the set of aiBones that are used to deform the mesh, which makes sense. So when gathering all bones from all meshes, we might not get all of them, as there might be some bones that have a different bind pose transform while they are not assigned to a skin influence. Is there an aiBone for this case, and what is the recommended way to access them?
Kim Kulling
@kimkulling
As far as I know you can get the bind poses from the aiNodes via their names
There is a post-processing step, PopulateBindPoses, that shows how to do that
You should be able to use this post-process as a template
And Hi :-)
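The name-based lookup Kim describes can be sketched roughly like this. The Node struct below is a simplified stand-in for aiNode (whose real FindNode member does the same depth-first search by name), so treat it as an illustration rather than Assimp code:

```cpp
#include <string>
#include <vector>

// Simplified stand-in for aiNode, just enough to show the name lookup.
struct Node {
    std::string mName;
    std::vector<Node> mChildren;
};

// Depth-first search by name, mirroring what aiNode::FindNode does:
// a bone stores the name of the node it belongs to, so the node (and
// thus its transform in the hierarchy) can be resolved from that name.
const Node* FindNode(const Node& root, const std::string& name) {
    if (root.mName == name) return &root;
    for (const Node& child : root.mChildren)
        if (const Node* hit = FindNode(child, name)) return hit;
    return nullptr;
}
```

Once the node for a bone's name is found, its transform chain up to the root gives the pose that bone refers to in the node hierarchy.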
Benjamin Jillich
@amzn-jillich
Do you mean the ArmaturePopulate post-process? If yes, doesn't that have the same issue? It gathers the bones from the meshes only, which only returns aiBones for the ones that actually have a skinning influence and deform at least a single vertex. I did not find a post-process related to bind poses.
Kim Kulling
@kimkulling
Maybe, I need to check
Hm, I guess meshes can also store bones which are not influencing their data. The related node will be referenced via the name stored in the bone. So the storing of bones inside of a mesh is there for historical reasons
Does this make sense to you?
Benjamin Jillich
@amzn-jillich

I see these options:

  1. Store them inside the aiMeshes with aiBones: This results in duplicated data the more meshes we have, but the amount of data should be limited, as the aiBones don't take that much memory, right? It would weaken one advantage of this method though: when passing the inverse bind pose/offset matrices to the GPU, it is exactly that list we want to pass over, from the bones that actually influence the mesh.
  2. Store them on some centralized external place: Like some global bind pose storage where we can query the bind pose transform for all aiNodes. Downside of that though is that we have the offset/bind pose matrices duplicated on two places then.
  3. Add a new (optional?) attribute to the aiNode as the bind pose/offset matrices are not limited to aiBones (even though it is still valuable to have access to the bones via a mesh). Same downside with the duplicated offset matrices.

We can't store them in the ArmaturePopulate post-process (that might have been a good place?) because that would require the information to be available in the scene already.
What do you think?

Kim Kulling
@kimkulling
What do you think about a Skeleton data structure?
Benjamin Jillich
@amzn-jillich
What would that contain? The bind pose matrices and a set of root aiNodes?
Kim Kulling
@kimkulling
And the bone data?
Kim Kulling
@kimkulling
At this moment all the data is stored in the aiMesh, and I think in the past this was the right place. But now, with the bind-pose data, it no longer seems to be the right place for this.
Feedback is welcome!
Benjamin Jillich
@amzn-jillich
That sounds like a great way to me! The ArmaturePopulate post-process functionality could be baked into the Skeleton as well to have an easy way to translate between aiNode and aiBone.
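One hypothetical shape such a Skeleton could take, just to make the idea concrete; none of these names existed in Assimp at the time of this discussion, and Matrix4 merely stands in for aiMatrix4x4:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Stand-in for aiMatrix4x4.
struct Matrix4 { float m[4][4]; };

// Hypothetical per-bone record: valid even for bones without skin influences.
struct SkeletonBone {
    std::string mNodeName;    // links back to the aiNode of the same name
    int mParent = -1;         // index into the skeleton's bone array, -1 for roots
    Matrix4 mOffsetMatrix{};  // inverse bind pose / offset matrix
};

// Hypothetical central skeleton, decoupled from any single aiMesh.
struct Skeleton {
    std::string mName;
    std::vector<SkeletonBone> mBones;

    // Look up a bone index by node name; returns -1 if not present.
    int FindBone(const std::string& nodeName) const {
        for (std::size_t i = 0; i < mBones.size(); ++i)
            if (mBones[i].mNodeName == nodeName) return static_cast<int>(i);
        return -1;
    }
};
```

A name-to-index lookup like FindBone is also where the ArmaturePopulate-style aiNode/aiBone translation mentioned below could live.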
Kim Kulling
@kimkulling
Yes, right
After struggling so much with USD this would be a more relaxing task :-)
Benjamin Jillich
@amzn-jillich
USD = docs?
That would be lovely and very much appreciated :)
Two more questions actually. O3DE currently runs its own mesh optimizations, but we might jump onto leveraging what Assimp provides and rather invest in improving that together than maintain another one.
  1. When there are morph targets and optimizations are running, are the deltas on the morph targets remapped to the optimized indices? Are morph targets handled, or are these meshes skipped during optimization?
  2. Are there per-mesh and per-LOD settings (like using a single influence for LOD 2, or disabling vertex welding on one mesh while having it enabled for another)? And if not, what kind of effort would this be?
paulvirtuel
@paulvirtuel
Hi all,
I just started using Assimp which looks like a great library by the way.
I mainly need it to import rigged/skinned objects in Collada (.dae) format from MakeHuman and Blender.
So far, when importing from MakeHuman, I got an exception for a vector index out of range in the bone vertex/weight list. At first glance it seems that the vertex index is used to index the bone list instead of the bone index.
I tried loading the same object in Blender, where it looked OK, so I re-exported it in Collada to try importing it again using Assimp. This time I did not get an exception, but when I loop over the list of joints/bones, it seems to have only 25 instead of the 31 from the Collada file.
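For reference, a minimal sketch of how bone weights are indexed: in Assimp, aiVertexWeight::mVertexId indexes the mesh's vertex array, never the bone array. VertexWeight and Bone below are simplified stand-ins for aiVertexWeight/aiBone, and the bounds check marks where an out-of-range index like the one described would be caught:

```cpp
#include <cstddef>
#include <vector>

// Simplified stand-ins for aiVertexWeight and aiBone.
struct VertexWeight { unsigned int mVertexId; float mWeight; };
struct Bone { std::vector<VertexWeight> mWeights; };

// Accumulate per-vertex total weights. mVertexId is a *vertex* index,
// so it must be checked against the vertex count, not the bone count.
std::vector<float> AccumulateWeights(const std::vector<Bone>& bones,
                                     std::size_t numVertices) {
    std::vector<float> total(numVertices, 0.0f);
    for (const Bone& bone : bones)
        for (const VertexWeight& w : bone.mWeights)
            if (w.mVertexId < numVertices)  // guards against the crash described
                total[w.mVertexId] += w.mWeight;
    return total;
}
```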
Viktor Kovacs
@kovacsv

Hi All,

I've just released assimpjs (https://github.com/kovacsv/assimpjs), the emscripten compiled version of assimp, that runs entirely in the browser. I'm happy to hear your feedback and ideas. You can join the discussion here: https://github.com/assimp/assimp/discussions/4059

JayChang-zhe
@JayChang-zhe

Hi all,
I have a question about how to get the texture index from an aiMaterial. I can get all the embedded textures through scene->mTextures[i] and load them in UE4, but I don't really know how to map the textures to the materials, especially since some meshes have a material (with or without a texture) and some meshes might not.

Can anyone help me with this? Thanks in advance

Kim Kulling
@kimkulling
There are 2 different options: using embedded textures (these are stored in aiScene::mTextures)
The second option is to load files from a file system. The difference between them is a "*" at the beginning of the texture path
All meshes have materials, but only some of them have textures
so you need to check if a material has a dedicated texture stored in its texture channel. We have different texture channels, like diffuse textures or specular textures (mostly used for light maps and so on)
@kovacsv Thanks a lot for sharing that
JayChang-zhe
@JayChang-zhe

@kimkulling Thanks for your reply. In my case I only use the embedded textures (most of the time only the diffuse texture), and I get the texture through aiScene::mTextures. My problem was that I didn't know how to get the index from the material and then fetch the texture from aiScene::mTextures.

After making some tests, I guess I need to use:
aiString texture_file;
mScenePtr->mMaterials[materialIndex]->Get(AI_MATKEY_TEXTURE(aiTextureType_DIFFUSE, 0), texture_file);
then "texture_file.data", which starts with "*", will contain the texture index inside aiScene::mTextures.
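A minimal sketch of that decoding step, assuming the "*" convention described above (recent Assimp versions also offer an aiScene::GetEmbeddedTexture helper that resolves such paths for you):

```cpp
#include <cstdlib>
#include <string>

// Decode a material texture path: a leading '*' means "embedded texture",
// followed by the decimal index into aiScene::mTextures; anything else is
// a file path to load from disk. Returns -1 for external files.
int EmbeddedTextureIndex(const std::string& texturePath) {
    if (!texturePath.empty() && texturePath[0] == '*')
        return std::atoi(texturePath.c_str() + 1);  // "*3" -> 3
    return -1;
}
```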

Kim Kulling
@kimkulling
Correct
You can find an example of how to do this in the 3MF loader
D3MFImporter, XMLSerializer, etc.
JayChang-zhe
@JayChang-zhe
Okay, got it. I will try it later, thanks a lot @kimkulling
Kim Kulling
@kimkulling
You are welcome
AndreiDespinoiu
@AndreiDespinoiu
hi, I'm having some trouble extracting UV transforms from .obj files (using Assimp 5.0.1)
I'm mostly interested in obtaining the UV scale. Here's what I have in the .obj:
map_Kd -s 4.000000 4.000000 1.000000 baseColor.png
and here's how I'm trying to extract it:
aiUVTransform uvTransform;
unsigned int max = sizeof(aiUVTransform) / sizeof(ai_real);
// Query once and keep the result instead of calling aiGetMaterialFloatArray twice.
const aiReturn result = aiGetMaterialFloatArray(mat, AI_MATKEY_UVTRANSFORM(aiTextureType_DIFFUSE, 0),
    (ai_real*)&uvTransform, &max);
if (AI_SUCCESS != result)
{
    std::cout << "Failed to retrieve UV information from material in texture \"" << path
        << "\". Result:" << result << '\n';
}

std::cout << "S=" << uvTransform.mScaling.x       << ", " << uvTransform.mScaling.y
        << "  T=" << uvTransform.mTranslation.x   << ", " << uvTransform.mTranslation.y
        << "  R=" << uvTransform.mRotation        << '\n';
AndreiDespinoiu
@AndreiDespinoiu
this is the output:
Failed to retrieve UV information from material in texture "baseColor.png". Result:-1
S=1, 1 T=0, 0 R=0
AndreiDespinoiu
@AndreiDespinoiu
I compiled the master branch from GitHub and it seems it doesn't detect texture scaling for Wavefront OBJ files at all, only for glTF 2.0, and with a bug: it says S=30, 30 T=0, 29 R=0. The scaling is correct, but there isn't supposed to be any translation; I don't know where the "29" comes from:
"extensions": {
    "KHR_texture_transform": {
        "offset": [
            0,
            0
        ],
        "scale": [
            30,
            30
        ],
        "texCoord": 0
    }
}
Chris
@ushort
Hello, I have a mesh and I was wondering how I could translate my animations into Assimp. I have groups of vertices (I assume these would be considered bones) and transformations that can be applied to a group of bones, such as scaling, rotation, etc. Is there any suggestion or resource on how I can go about this? I noticed aiMesh has mAnimMeshes, which I believe is just an array of the translated mesh, but the docs say it's not in use?
AndreiDespinoiu
@AndreiDespinoiu
stiangglanda®
@stiangglanda
Hi, I am currently implementing skeletal animation using DirectX and I use a left-handed coordinate system
stiangglanda®
@stiangglanda
I have two questions. First: since I need to transpose every matrix (the bone offset in my case), do I need to do something to the rotation (vector 4), position (vector 3) and scaling (vector 3) as well, maybe swap y and z?
stiangglanda®
@stiangglanda
And second, I don't know what the transformation in aiNode does. I thought it might be the bone offset (for the bones), but since the data isn't the same this can't be true. Or is it scaling x position x rotation, already calculated?
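On the second question: aiNode::mTransformation is the node's transform relative to its parent, already composed from the scaling, rotation and translation components, so the world transform is obtained by multiplying up the parent chain. A self-contained sketch, with Mat4 as a stand-in for aiMatrix4x4 (row-major, translation in the last column):

```cpp
#include <array>

// Stand-in for aiMatrix4x4: 16 floats, row-major, translation in column 3.
struct Mat4 {
    std::array<float, 16> m;
};

// Plain row-major 4x4 matrix product r = a * b.
Mat4 Multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row * 4 + col] += a.m[row * 4 + k] * b.m[k * 4 + col];
    return r;
}

// Stand-in for aiNode: a local transform plus a parent link.
struct Node {
    Mat4 mTransformation;
    const Node* mParent = nullptr;
};

// Accumulate parent * local while walking up to the root.
Mat4 GlobalTransform(const Node& node) {
    Mat4 global = node.mTransformation;
    for (const Node* p = node.mParent; p != nullptr; p = p->mParent)
        global = Multiply(p->mTransformation, global);
    return global;
}
```

The bone offset matrix is a different thing: it maps from mesh space into the bone's local space, which is why it doesn't match the node transforms.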
Chris
@ushort
@AndreiDespinoiu Thank you, that was very informative. Unfortunately, my "bones" do not have a hierarchy. I simply have groups of vertices and the transformations for each frame. Is this supported in Assimp?