Merge pull request #61 from pitmonticone/docs-fix-typos
[REVIEW]: Fix typos in docs
EtienneCmb authored Oct 16, 2024
2 parents 66d4aaa + 9a1846e commit 0c2fec0
Showing 5 changed files with 16 additions and 16 deletions.
4 changes: 2 additions & 2 deletions docs/api/api_core.rst
@@ -1,7 +1,7 @@
 ``hoi.core``
 ------------

-Core information-theoretical measures and combinatory.
+Core information-theoretical measures and combinatorics.

 .. currentmodule:: hoi.core

@@ -29,7 +29,7 @@ Measures of Mutual Information
    get_mi


-Combinatory
+Combinatorics
 +++++++++++
 .. autosummary::
    :toctree: generated/
6 changes: 3 additions & 3 deletions docs/contributor_guide.rst
@@ -35,7 +35,7 @@ The full installation of HOI includes additional packages to test the software a
 Contributing to HOI
 -------------------

-- For questions, please use the `discusssions page <https://github.com/brainets/hoi/discussions>`_
+- For questions, please use the `discussions page <https://github.com/brainets/hoi/discussions>`_
 - You can signal a bug or suggest improvements to the code base or the documentation by `opening an issue <https://github.com/brainets/hoi/issues>`_

 Contributing code using pull requests
@@ -45,7 +45,7 @@ We do all of our development using git, so basic knowledge is assumed.

 Follow these steps to contribute code:

-1. Fork the hoi repository by cliking the **Fork** button on the `repository page <https://github.com/brainets/hoi>`_
+1. Fork the hoi repository by clicking the **Fork** button on the `repository page <https://github.com/brainets/hoi>`_
 2. Install Python >= 3.8
 3. Clone the hoi repository to your computer and install hoi :

@@ -79,7 +79,7 @@ Follow these steps to contribute code:
    pytest -v

-Each python file inside HOI is tested to ensure that fonctionalities of HOI are maintained with each commit. If you modify a file, or example `hoi/core/entropies.py`, you can run the tests for this specific file only located in `hoi/core/tests/tests_entropies.py` If you want to only test the files you modified you can use :
+Each python file inside HOI is tested to ensure that functionalities of HOI are maintained with each commit. If you modify a file, for example `hoi/core/entropies.py`, you can run the tests for this specific file only, located in `hoi/core/tests/tests_entropies.py`. If you want to only test the files you modified you can use :

 .. code-block:: shell
2 changes: 1 addition & 1 deletion docs/glossary.rst
@@ -34,7 +34,7 @@ Glossary

 Network encoding
     Higher Order Interactions between a set of variables modulated by a target variable.
-    Measures of exstrinsic information :cite:`luppi2024information`, i.e. information carried by a group of
+    Measures of extrinsic information :cite:`luppi2024information`, i.e. information carried by a group of
     variables about an external target are part of this group.
     `Directed` metrics :cite:`rosas2024characterising`, such as the
     Redundancy-synergy index (RSI), are also part of this group.
4 changes: 2 additions & 2 deletions docs/quickstart.rst
@@ -65,10 +65,10 @@ To inspect your results, we provide a plotting function called :func:`hoi.plot.p
 Practical recommendations
 +++++++++++++++++++++++++

-Robust estimations of HOI strongly rely on the accuity of measuring entropy/mutual information on/between (potentially highly) multivariate data. In the :doc:`auto_examples/index` section you can find benchmarks of our entropy estimators. Here we recommend :
+Robust estimations of HOI strongly rely on the acuity of measuring entropy/mutual information on/between (potentially highly) multivariate data. In the :doc:`auto_examples/index` section you can find benchmarks of our entropy estimators. Here we recommend :

 * **Measuring entropy and mutual information :** we recommend the Gaussian Copula method (`method="gc"`). Although this measure is not accurate for capturing relationships beyond the gaussian assumption (see :ref:`sphx_glr_auto_examples_it_plot_entropies.py`), this method performs relatively well for multivariate data (see :ref:`sphx_glr_auto_examples_it_plot_entropies_mvar.py`)
-* **Measuring Higher-Order Interactions for network behavior and network encoding :** for network behavior and ncoding, we recommend respectively the O-information :class:`hoi.metrics.Oinfo` and the :class:`hoi.metrics.GradientOinfo`. Although both metrics suffer from the same limitations, like the spreading to higher orders, this can be mitigated using a boostrap approach (see :ref:`sphx_glr_auto_examples_statistics_plot_bootstrapping.py`). Otherwise, both metrics are usually pretty accurate to retrieve the type of interactions between variables, especially once combined with the Gaussian Copula.
+* **Measuring Higher-Order Interactions for network behavior and network encoding :** for network behavior and encoding, we recommend respectively the O-information :class:`hoi.metrics.Oinfo` and the :class:`hoi.metrics.GradientOinfo`. Although both metrics suffer from the same limitations, like the spreading to higher orders, this can be mitigated using a bootstrap approach (see :ref:`sphx_glr_auto_examples_statistics_plot_bootstrapping.py`). Otherwise, both metrics are usually pretty accurate to retrieve the type of interactions between variables, especially once combined with the Gaussian Copula.
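For instance, a minimal sketch of this recommendation, assuming the quickstart-style API (`Oinfo(x)` followed by `model.fit(minsize, maxsize, method)`; exact signatures may differ between releases):

.. code-block:: python

    import numpy as np
    from hoi.metrics import Oinfo

    # toy data: 200 samples x 6 variables, with a redundant triplet built in
    rng = np.random.default_rng(0)
    x = rng.standard_normal((200, 6))
    x[:, 1] = x[:, 0] + 0.1 * rng.standard_normal(200)
    x[:, 2] = x[:, 0] + 0.1 * rng.standard_normal(200)

    # O-information of every multiplet of size 3 or 4, estimated with the
    # Gaussian Copula method recommended above
    model = Oinfo(x)
    hoi = model.fit(minsize=3, maxsize=4, method="gc")

Under the O-information sign convention, positive values indicate redundancy-dominated multiplets and negative values synergy-dominated ones, so the triplet (0, 1, 2) should stand out as strongly positive here.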


 Other software for the analysis of higher-order interactions
16 changes: 8 additions & 8 deletions docs/theory.rst
@@ -84,7 +84,7 @@ entropy of a continuous variable, different methods are implemented in the toolb
   discrete set of bins. In this way, variables are discretized and the entropy
   can be computed as described above, correcting for the bin size.

-* Binning method, that allow to extimate the entropy of a discrete variable
+* Binning method, that allows to estimate the entropy of a discrete variable
   estimating the probability of each possible value in a frequentist approach.
   Note that this procedure can also be performed on continuous variables after
   binarization in many different ways
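A minimal NumPy sketch of such a binning estimator (the helper `binned_entropy` and its defaults are illustrative, not the toolbox's implementation):

.. code-block:: python

    import numpy as np

    def binned_entropy(x, n_bins=8, correct=True):
        """Histogram (plug-in) entropy of a 1-D sample, in bits."""
        counts, edges = np.histogram(x, bins=n_bins)
        p = counts[counts > 0] / counts.sum()   # frequentist bin probabilities
        h = -np.sum(p * np.log2(p))             # entropy of the discretized variable
        # bin-size correction: shifts the discrete entropy towards the
        # differential entropy of the underlying continuous variable
        return h + np.log2(edges[1] - edges[0]) if correct else h

    rng = np.random.default_rng(0)
    print(binned_entropy(rng.standard_normal(100_000)))  # close to 0.5*log2(2*pi*e) ~ 2.05 bits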
@@ -126,7 +126,7 @@ MI between two variables, quantifies how much knowing one variable reduces the u
 the other and measures the interdependency between the two variables. If they are independent,
 we have :math:`H(X,Y)=H(X)+H(Y)`, hence :math:`MI(X,Y)=0`. Since the MI can be reduced to a
 signed sum of entropies, the problem of how to estimate MI from continuous data can be
-reconducted to the problem, discussed above, of how to estimate entropies. An estimator
+reduced to the problem, discussed above, of how to estimate entropies. An estimator
 that has been recently developed and presents interesting properties when computing the MI
 is the Gaussian Copula estimator :cite:`ince2017statistical`. This estimator is based on the
 statistical theory of copulas and is proven to provide a lower bound to the real value of MI,
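The copula idea can be sketched in a few lines (a plain plug-in version for illustration; the published estimator adds finite-sample bias corrections): rank each variable to uniform marginals, map through the inverse normal CDF, then evaluate a parametric Gaussian MI on the transformed data.

.. code-block:: python

    import numpy as np
    from scipy.stats import norm, rankdata

    def copnorm(x):
        """Gaussian-copula transform: standard-normal marginals per row."""
        u = rankdata(x, axis=-1) / (x.shape[-1] + 1)  # empirical CDF in (0, 1)
        return norm.ppf(u)

    def mi_gauss_bits(x, y):
        """Parametric Gaussian MI (bits); the 2*pi*e terms cancel out."""
        h = lambda z: 0.5 * np.linalg.slogdet(np.cov(z))[1]
        return (h(x) + h(y) - h(np.vstack((x, y)))) / np.log(2)

    rng = np.random.default_rng(0)
    x = rng.standard_normal((2, 2000))            # 2-D variable, 2000 samples
    y = x + 0.5 * rng.standard_normal((2, 2000))  # dependent 2-D variable
    print(mi_gauss_bits(copnorm(x), copnorm(y)))  # lower bound on the true MI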
@@ -176,7 +176,7 @@ concise explanation and relevant references.
 Total correlation
 -----------------

-Total correlation, :class:`hoi.metrics.TC`, is the oldest exstension of mutual information to
+Total correlation, :class:`hoi.metrics.TC`, is the oldest extension of mutual information to
 an arbitrary number of variables :cite:`watanabe1960information, studeny1998multiinformation`.
 For a group of variables :math:`X^n = \{ X_1, X_2, ..., X_n \}`, it is defined in the following way:

@@ -185,7 +185,7 @@ For a group of variables :math:`X^n = \{ X_1, X_2, ..., X_n \}`, it is defined
     TC(X^{n}) &= \sum_{j=1}^{n} H(X_{j}) - H(X^{n}) \\

 The total correlation quantifies the strength of collective constraints ruling the system;
-it is sentive to information shared between single variables and it can be associated with
+it is sensitive to information shared between single variables and it can be associated with
 redundancy.

 .. minigallery:: hoi.metrics.TC
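For jointly Gaussian variables the definition above reduces to log-determinants of the covariance matrix, which gives a compact illustration (a sketch under a Gaussian assumption, not :class:`hoi.metrics.TC` itself):

.. code-block:: python

    import numpy as np

    def gaussian_tc_bits(x):
        """TC = sum_j H(X_j) - H(X^n) for the rows of x (Gaussian model);
        the n * 0.5 * log(2*pi*e) constants cancel between the two terms."""
        cov = np.cov(x)
        sum_h_single = 0.5 * np.sum(np.log(np.diag(cov)))
        h_joint = 0.5 * np.linalg.slogdet(cov)[1]
        return (sum_h_single - h_joint) / np.log(2)

    rng = np.random.default_rng(0)
    z = rng.standard_normal(5000)
    x = z + 0.3 * rng.standard_normal((4, 5000))  # four noisy copies of z
    print(gaussian_tc_bits(x))                    # large TC: one shared constraint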
@@ -195,7 +195,7 @@ Dual Total correlation

 Dual total correlation, :class:`hoi.metrics.DTC`, is another extension of mutual information to
 an arbitrary number of variables, also known as binding information and excess
-entropy, :cite:`sun1975linear`. It quatifies the part of the joint entropy that
+entropy, :cite:`sun1975linear`. It quantifies the part of the joint entropy that
 is shared by at least two variables in the following way:

 .. math::
@@ -223,7 +223,7 @@ correlation (DTC), :cite:`james2011anatomy`:
     &= -nH(X^{n}) + \sum_{j=1}^{n} [H(X_{j}) + H(X_{-j}^{n})]

-It is sensitive to both redundancy and synergy, quantifying the total ammount of constraints
+It is sensitive to both redundancy and synergy, quantifying the total amount of constraints
 ruling the system under study.

 .. minigallery:: hoi.metrics.Sinfo
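Combining the two definitions, here is a Gaussian sketch of TC, DTC and the S-information (the helper name is illustrative; it uses the identity :math:`DTC(X^n) = (1-n)H(X^n) + \sum_j H(X_{-j}^n)`, from which the :math:`-nH(X^{n})` term above follows):

.. code-block:: python

    import numpy as np

    def tc_dtc_sinfo_bits(x):
        """TC, DTC and S-info (bits) of the rows of x under a Gaussian model;
        every 0.5 * log(2*pi*e) constant cancels in the three quantities."""
        n = x.shape[0]
        cov = np.cov(x)
        h = lambda a: 0.5 * np.linalg.slogdet(np.atleast_2d(np.cov(a)))[1]
        h_joint = 0.5 * np.linalg.slogdet(cov)[1]
        sum_h_single = 0.5 * np.sum(np.log(np.diag(cov)))
        sum_h_minus = sum(h(np.delete(x, j, axis=0)) for j in range(n))
        tc = sum_h_single - h_joint            # sum_j H(X_j) - H(X^n)
        dtc = (1 - n) * h_joint + sum_h_minus  # (1-n)H(X^n) + sum_j H(X_-j)
        return tc / np.log(2), dtc / np.log(2), (tc + dtc) / np.log(2)

    rng = np.random.default_rng(0)
    x = rng.standard_normal((4, 5000))
    x[0] = x[1] + x[2] + 0.2 * rng.standard_normal(5000)  # add a collective constraint
    print(tc_dtc_sinfo_bits(x))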
@@ -322,9 +322,9 @@ mutual information (TDMI), :math:`I(X(t-\tau),Y(t-\tau);X(t),Y(t))`.
 Network encoding
 ****************

-The metrics that are listed in this section focus on measuring the informaiton
+The metrics that are listed in this section focus on measuring the information
 content that a set of variables carry about an external target of interest.
-Information-theoretic measures, such as Redundacy-Synergy index and the gradient O-information,
+Information-theoretic measures, such as the Redundancy-Synergy index and the gradient O-information,
 are useful for studying the behavior of different variables in relationship with an
 external target. Once data is gathered, these measures of network encoding can be applied to unveil
 new insights about the functional interaction modulated by external variables of interest.
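A hypothetical usage sketch for this encoding setting (it assumes :class:`hoi.metrics.GradientOinfo` takes the data and target as `GradientOinfo(x, y)`, to be checked against the API reference):

.. code-block:: python

    import numpy as np
    from hoi.metrics import GradientOinfo

    # toy task: 6 variables, a target carried synergistically by the first two
    rng = np.random.default_rng(0)
    x = rng.standard_normal((300, 6))
    y = x[:, 0] * x[:, 1] + 0.1 * rng.standard_normal(300)

    model = GradientOinfo(x, y)
    hoi = model.fit(minsize=2, maxsize=3, method="gc")  # negative values ~ synergy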