Discussion
aeva
@aeva@mastodon.gamedev.place · 2 weeks ago

i made some good progress on my skeleton today

i now have a python script that extracts the data i want from the DICOM files into numpy arrays and then dumps the numpy arrays into a c++ function i wrote that transforms the voxel data into a ply pointcloud. i then import the pointcloud into #blender and use #geometrynodes to trim the scan data down to bones etc.
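
roughly what that extraction step looks like (a simplified sketch using pydicom; the paths and bone threshold are placeholders, and the actual ply dump happens in the c++ function rather than in python):

```python
# sketch: read a CT series with pydicom, stack it into a numpy volume,
# keep voxels above a bone-ish Hounsfield threshold, and write them out
# as an ascii ply point cloud. paths and threshold are made up.
import glob
import numpy as np
import pydicom

slices = [pydicom.dcmread(path) for path in glob.glob("scan/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

# convert stored pixel values to Hounsfield units
volume = np.stack(
    [s.pixel_array * s.RescaleSlope + s.RescaleIntercept for s in slices]
).astype(np.float32)

# voxel spacing (z, y, x) so the point cloud keeps real-world proportions
spacing = np.array([
    float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2]),
    float(slices[0].PixelSpacing[0]),
    float(slices[0].PixelSpacing[1]),
])

BONE_HU = 300  # tune per scan
points = np.argwhere(volume > BONE_HU) * spacing

with open("skeleton.ply", "w") as ply:
    ply.write("ply\nformat ascii 1.0\n")
    ply.write(f"element vertex {len(points)}\n")
    ply.write("property float x\nproperty float y\nproperty float z\n")
    ply.write("end_header\n")
    for z, y, x in points:
        ply.write(f"{x} {y} {z}\n")
```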

Image description: A screenshot of blender 4.4.1 showing a point cloud of my skeleton. The point cloud was extracted from a set of DICOM files from a CT scan I had recently on my abdomen. You can see my skeleton from around the base of my sternum down to my pelvis. The interface shows that I'm using geometry nodes to process the imported data and the scene takes up 2.09 GiB of RAM.
halcy :icosahedron:
@halcy@icosahedron.website replied · 2 weeks ago
@aeva god this rules
aeva
@aeva@mastodon.gamedev.place replied · 2 weeks ago
@halcy :D
aeva
@aeva@mastodon.gamedev.place replied · 2 weeks ago

this will be useful for iterating further on cleaning up the data and labeling everything, so i can isolate the vertebrae from one another and generate cleaner meshes for 3D printing
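
for example, one way to start separating pieces like the vertebrae is connected-component labeling on the bone mask (just a sketch with scipy, reusing the volume, spacing, and threshold from the earlier snippet; not necessarily the approach i'll end up with, and vertebrae that touch in the scan would still need extra work):

```python
# sketch: split the thresholded bone mask into connected components with
# scipy and save each big piece separately. reuses `volume`, `spacing`,
# and BONE_HU from the previous snippet. vertebrae that touch in the scan
# will still end up in one component, so this is only a starting point.
import numpy as np
from scipy import ndimage

bone_mask = volume > BONE_HU
labels, count = ndimage.label(bone_mask)

# measure component sizes, skip small speckles, dump the rest as point sets
sizes = ndimage.sum(bone_mask, labels, index=range(1, count + 1))
for i, size in enumerate(sizes, start=1):
    if size < 5000:  # made-up minimum voxel count
        continue
    points = np.argwhere(labels == i) * spacing
    np.save(f"component_{i:03d}.npy", points)
```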

blender struggles with this data set though, so i am planning on doing most of the bulk processing in c++ and reimporting the data to iterate
