

Fix RandLANet docs
ssheorey committed Feb 15, 2022
1 parent 8ddb672 commit 3735b7c
Showing 2 changed files with 30 additions and 24 deletions.
27 changes: 15 additions & 12 deletions ml3d/tf/models/randlanet.py
@@ -11,23 +11,26 @@


 class RandLANet(BaseModel):
-    """Class defining RandLANet, a Semantic Segmentation model.
-    Based on the architecture
-    https://arxiv.org/abs/1911.11236#
-    Reference Implementation - https://github.com/QingyongHu/RandLA-Net
+    """Class defining RandLANet, a Semantic Segmentation model. Based on the
+    architecture from the paper `RandLA-Net: Efficient Semantic Segmentation of
+    Large-Scale Point Clouds <https://arxiv.org/abs/1911.11236>`__.

-    RandLA-Net is an efficient and lightweight neural architecture which directly infer
-    per-point semantics for large-scale point clouds. The key approach is to use random
-    point sampling instead of more complex point selection approaches. Although
-    remarkably computation and memory efficient, random sampling can discard key features
-    by chance. To overcome this, we introduce a novel local feature aggregation module to
-    progressively increase the receptive field for each 3D point, thereby effectively
-    preserving geometric details.
+    RandLA-Net is an efficient and lightweight neural architecture which
+    directly infers per-point semantics for large-scale point clouds. The key
+    approach is to use random point sampling instead of more complex point
+    selection approaches. Although remarkably computation and memory
+    efficient, random sampling can discard key features by chance. To overcome
+    this, we introduce a novel local feature aggregation module to
+    progressively increase the receptive field for each 3D point, thereby
+    effectively preserving geometric details.

-    Architecture
+    **Architecture**

     .. image:: https://user-images.githubusercontent.com/23613902/150006228-34fb9e04-76b6-4022-af08-c308da6dcaae.png
         :width: 100%
+
+    References:
+        https://github.com/QingyongHu/RandLA-Net
     """

     def __init__(
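The docstring above pins RandLA-Net's scalability on random point sampling rather than costlier selection schemes. Below is a minimal sketch of that trade-off in plain NumPy; the helper names and shapes are illustrative only and are not part of the Open3D-ML API.

import numpy as np

def random_sample(points: np.ndarray, n: int) -> np.ndarray:
    """Random point sampling: cost per pick is independent of cloud size."""
    idx = np.random.choice(len(points), n, replace=False)
    return points[idx]

def farthest_point_sample(points: np.ndarray, n: int) -> np.ndarray:
    """Farthest point sampling: every pick rescans all N points, O(n * N)."""
    selected = [0]  # arbitrary seed point
    dists = np.linalg.norm(points - points[0], axis=1)
    for _ in range(n - 1):
        selected.append(int(dists.argmax()))
        # Keep, for every point, its distance to the nearest selected point.
        dists = np.minimum(
            dists, np.linalg.norm(points - points[selected[-1]], axis=1))
    return points[selected]

cloud = np.random.rand(100_000, 3).astype(np.float32)
subset = random_sample(cloud, 4096)    # effectively instant at this scale
# farthest_point_sample(cloud, 4096)   # noticeably slower on large clouds

Random sampling can discard informative points by chance, which is exactly the gap the local feature aggregation module described above is meant to close.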
27 changes: 15 additions & 12 deletions ml3d/torch/models/randlanet.py
@@ -15,23 +15,26 @@


 class RandLANet(BaseModel):
-    """Class defining RandLANet, a Semantic Segmentation model.
-    Based on the architecture
-    https://arxiv.org/abs/1911.11236#
-    Reference Implementation - https://github.com/QingyongHu/RandLA-Net
+    """Class defining RandLANet, a Semantic Segmentation model. Based on the
+    architecture from the paper `RandLA-Net: Efficient Semantic Segmentation of
+    Large-Scale Point Clouds <https://arxiv.org/abs/1911.11236>`__.

-    RandLA-Net is an efficient and lightweight neural architecture which directly infer
-    per-point semantics for large-scale point clouds. The key approach is to use random
-    point sampling instead of more complex point selection approaches. Although
-    remarkably computation and memory efficient, random sampling can discard key features
-    by chance. To overcome this, we introduce a novel local feature aggregation module to
-    progressively increase the receptive field for each 3D point, thereby effectively
-    preserving geometric details.
+    RandLA-Net is an efficient and lightweight neural architecture which
+    directly infers per-point semantics for large-scale point clouds. The key
+    approach is to use random point sampling instead of more complex point
+    selection approaches. Although remarkably computation and memory
+    efficient, random sampling can discard key features by chance. To overcome
+    this, we introduce a novel local feature aggregation module to
+    progressively increase the receptive field for each 3D point, thereby
+    effectively preserving geometric details.

-    Architecture
+    **Architecture**

     .. image:: https://user-images.githubusercontent.com/23613902/150006228-34fb9e04-76b6-4022-af08-c308da6dcaae.png
         :width: 100%
+
+    References:
+        https://github.com/QingyongHu/RandLA-Net
     """

     def __init__(
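For context, the class documented in both files is normally driven through the Open3D-ML pipeline API. The following is a minimal training sketch in the style of the Open3D-ML README, assuming the default RandLANet configuration; the dataset path is a placeholder.

import open3d.ml.torch as ml3d  # open3d.ml.tf exposes the same interface

# Placeholder path: point this at a real SemanticKITTI download.
dataset = ml3d.datasets.SemanticKITTI(dataset_path='/path/to/SemanticKITTI/')

model = ml3d.models.RandLANet()  # the class whose docstring this commit fixes
pipeline = ml3d.pipelines.SemanticSegmentation(model=model, dataset=dataset)

pipeline.run_train()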
