
Automatic restart after library is installed #560

Open
ibobak opened this issue May 28, 2024 · 8 comments
Labels
feature-request Request for new features or functionality

Comments

@ibobak

ibobak commented May 28, 2024

Type: Bug

  1. Clone this project:
git clone https://github.com/ibobak/spark_framework.git
  2. Open the folder of this project in VS Code.
  3. Create a new local environment (.venv).
  4. Install all requirements:
pip install -r requirements.txt
  5. Open the file spark_framework/core.py - you will see a lot of "unable to import pyspark" messages, and this is expected: I forgot to include pyspark in the requirements.
  6. OK, let us manually execute:
pip install pyspark

The missing module is now installed; however, in the editor I still see the "unable to import" message.
The package is definitely installed - I can navigate to its code - but pylint does not see it, and this is the problem.

If the extension cannot automatically detect newly installed packages, it would be good to have some "force rescan" command or something like that, because the current behavior is really not obvious. In PyCharm such things were handled completely automatically.
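To confirm what the linter's interpreter can actually see, a quick diagnostic (my own sketch, not part of the extension) is to ask the same interpreter whether the module is resolvable on its `sys.path`:

```python
import importlib.util

def can_import(name: str) -> bool:
    """Return True if `name` is resolvable on this interpreter's sys.path."""
    return importlib.util.find_spec(name) is not None

# Run this with the same interpreter the linter uses, e.g.:
#   .venv/bin/python check_import.py
print(can_import("json"))     # stdlib module, always importable -> True
print(can_import("pyspark"))  # stays False until the venv actually has it
```

If this prints `True` for pyspark under `.venv/bin/python` while the editor still flags the import, the stale state is in the linter server, not in the environment.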

[two screenshots attached]

Extension version: 2023.10.1
VS Code version: Code 1.89.1 (dc96b837cf6bb4af9cd736aa3af08cf8279f7685, 2024-05-07T05:16:23.416Z)
OS version: Linux x64 6.4.6-060406-generic
Modes:

System Info
Item Value
CPUs Intel(R) Xeon(R) CPU E5-2696 v4 @ 2.20GHz (88 x 2194)
GPU Status 2d_canvas: enabled
canvas_oop_rasterization: disabled_off
direct_rendering_display_compositor: disabled_off_ok
gpu_compositing: enabled
multiple_raster_threads: enabled_on
opengl: enabled_on
rasterization: enabled
raw_draw: disabled_off_ok
skia_graphite: disabled_off
video_decode: enabled
video_encode: disabled_software
vulkan: disabled_off
webgl: enabled
webgl2: enabled
webgpu: disabled_off
Load (avg) 2, 2, 2
Memory (System) 251.76GB (92.98GB free)
Process Argv --crash-reporter-id 27d42247-63fb-4d9e-9cb0-87d9974843dc
Screen Reader no
VM 50%
DESKTOP_SESSION ubuntu-xorg
XDG_CURRENT_DESKTOP Unity
XDG_SESSION_DESKTOP ubuntu-xorg
XDG_SESSION_TYPE x11

@github-actions github-actions bot added the triage-needed Issue is not triaged. label May 28, 2024
@karthiknadig
Member

@ibobak Please provide logs from Output > Pylint.

@karthiknadig karthiknadig added bug Issue identified by VS Code Team member as probable bug info-needed Issue requires more information from poster labels May 28, 2024
@ibobak
Author

ibobak commented May 28, 2024

Logs before opening core.py:

2024-05-28 22:49:58.641 [info] Name: Pylint
2024-05-28 22:49:58.641 [info] Module: pylint
2024-05-28 22:49:58.641 [info] Python extension loading
2024-05-28 22:49:58.641 [info] Waiting for interpreter from Python extension.
2024-05-28 22:49:58.641 [info] No interpreter found from setting pylint.interpreter
2024-05-28 22:49:58.641 [info] Getting interpreter from ms-python.python extension for workspace /home/ihor/github/spark_framework
2024-05-28 22:49:58.641 [info] Python extension loaded
2024-05-28 22:49:58.641 [info] Interpreter from ms-python.python extension for /home/ihor/github/spark_framework: /home/ihor/github/spark_framework/.venv/bin/python
2024-05-28 22:49:58.641 [info] Using cwd from `python.analysis.extraPaths`.
2024-05-28 22:49:58.641 [info] No interpreter found from setting pylint.interpreter
2024-05-28 22:49:58.641 [info] Getting interpreter from ms-python.python extension for workspace /home/ihor/github/spark_framework
2024-05-28 22:49:58.641 [info] Interpreter from ms-python.python extension for /home/ihor/github/spark_framework: /home/ihor/github/spark_framework/.venv/bin/python
2024-05-28 22:49:58.641 [info] Using cwd from `python.analysis.extraPaths`.
2024-05-28 22:49:58.641 [info] Server run command: /home/ihor/github/spark_framework/.venv/bin/python /home/ihor/.vscode/extensions/ms-python.pylint-2023.10.1/bundled/tool/lsp_server.py
2024-05-28 22:49:58.641 [info] Server: Start requested.
2024-05-28 22:49:59.329 [info] CWD Server: /home/ihor/github/spark_framework
2024-05-28 22:49:59.329 [info] Settings used to run Server:
[
    {
        "cwd": "/home/ihor/github/spark_framework",
        "workspace": "file:///home/ihor/github/spark_framework",
        "args": [
            "--disable=missing-module-docstring",
            "--disable=missing-class-docstring",
            "--disable=missing-function-docstring",
            "--disable=too-many-arguments",
            "--disable=too-many-locals",
            "--disable=too-many-statements",
            "--disable=unsubscriptable-object",
            "--disable=trailing-whitespace",
            "--disable=unsupported-assignment-operation",
            "--max-line-length=150"
        ],
        "severity": {
            "convention": "Information",
            "error": "Error",
            "fatal": "Error",
            "refactor": "Hint",
            "warning": "Warning",
            "info": "Information"
        },
        "path": [],
        "ignorePatterns": [],
        "interpreter": [
            "/home/ihor/github/spark_framework/.venv/bin/python"
        ],
        "importStrategy": "useBundled",
        "showNotifications": "off",
        "extraPaths": [
            "src"
        ]
    }
]

2024-05-28 22:49:59.329 [info] Global settings:
{
    "cwd": "${workspaceFolder}",
    "workspace": "/home/ihor",
    "args": [
        "--disable=missing-module-docstring",
        "--disable=missing-class-docstring",
        "--disable=missing-function-docstring",
        "--disable=too-many-arguments",
        "--disable=too-many-locals",
        "--disable=too-many-statements",
        "--disable=unsubscriptable-object",
        "--disable=trailing-whitespace",
        "--disable=unsupported-assignment-operation",
        "--max-line-length=150"
    ],
    "severity": {
        "convention": "Information",
        "error": "Error",
        "fatal": "Error",
        "refactor": "Hint",
        "warning": "Warning",
        "info": "Information"
    },
    "path": [],
    "ignorePatterns": [],
    "interpreter": [],
    "importStrategy": "useBundled",
    "showNotifications": "off",
    "extraPaths": []
}

2024-05-28 22:49:59.329 [info] sys.path used to run Server:
   /home/ihor/github/spark_framework
   /home/ihor/.vscode/extensions/ms-python.pylint-2023.10.1/bundled/libs
   /home/ihor/.vscode/extensions/ms-python.pylint-2023.10.1/bundled/tool
   /usr/lib/python310.zip
   /usr/lib/python3.10
   /usr/lib/python3.10/lib-dynload
   /home/ihor/github/spark_framework/.venv/lib/python3.10/site-packages
2024-05-28 22:49:59.332 [info] /home/ihor/github/spark_framework/.venv/bin/python -m pylint --version
2024-05-28 22:49:59.332 [info] CWD Linter: /home/ihor/github/spark_framework
2024-05-28 22:49:59.433 [info] 
pylint 3.0.2
astroid 3.0.1
Python 3.10.13 (main, Sep  5 2023, 06:03:44) [GCC 11.4.0]


2024-05-28 22:49:59.433 [info] Version info for linter running for /home/ihor/github/spark_framework:
pylint 3.0.2
astroid 3.0.1
Python 3.10.13 (main, Sep  5 2023, 06:03:44) [GCC 11.4.0]

2024-05-28 22:49:59.433 [info] SUPPORTED pylint>=2.12.2
FOUND pylint==3.0.2
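For comparison with the `sys.path` the server reports above, the search path of any interpreter can be dumped directly (a diagnostic sketch; substitute the venv binary shown in the logs):

```python
import subprocess
import sys

def dump_sys_path(python_bin: str) -> list[str]:
    """Ask the given interpreter binary for its module search path."""
    out = subprocess.check_output(
        [python_bin, "-c", "import sys; print('\\n'.join(sys.path))"],
        text=True,
    )
    return out.splitlines()

# e.g. dump_sys_path("/home/ihor/github/spark_framework/.venv/bin/python")
for entry in dump_sys_path(sys.executable):
    print(entry)
```

If the venv's `site-packages` entry appears here but pylint still reports import-error, the server process is likely caching an older environment view and needs a restart.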

Logs after opening core.py (as soon as I opened it, I got errors; pyspark was still not installed at this moment):

2024-05-28 22:50:29.362 [info] [Trace - 10:50:29 PM] Sending notification 'textDocument/didOpen'.
2024-05-28 22:50:29.450 [info] [Trace - 10:50:29 PM] Received notification 'window/logMessage'.
2024-05-28 22:50:29.451 [info] /home/ihor/github/spark_framework/.venv/bin/python -m pylint --reports=n --output-format=json --disable=missing-module-docstring --disable=missing-class-docstring --disable=missing-function-docstring --disable=too-many-arguments --disable=too-many-locals --disable=too-many-statements --disable=unsubscriptable-object --disable=trailing-whitespace --disable=unsupported-assignment-operation --max-line-length=150 --clear-cache-post-run=y --from-stdin /home/ihor/github/spark_framework/spark_framework/core.py
2024-05-28 22:50:29.454 [info] [Trace - 10:50:29 PM] Received notification 'window/logMessage'.
2024-05-28 22:50:29.455 [info] CWD Linter: /home/ihor/github/spark_framework
2024-05-28 22:50:30.241 [info] [Trace - 10:50:30 PM] Sending request 'textDocument/codeAction - (1)'.
2024-05-28 22:50:30.277 [info] [Trace - 10:50:30 PM] Sending notification '$/cancelRequest'.
2024-05-28 22:50:30.278 [info] [Trace - 10:50:30 PM] Sending request 'textDocument/codeAction - (2)'.
2024-05-28 22:50:30.885 [info] [Trace - 10:50:30 PM] Sending notification '$/cancelRequest'.
2024-05-28 22:50:30.888 [info] [Trace - 10:50:30 PM] Sending request 'textDocument/codeAction - (3)'.
2024-05-28 22:50:34.577 [info] [Trace - 10:50:34 PM] Sending request 'textDocument/codeAction - (4)'.
2024-05-28 22:50:34.578 [info] [Trace - 10:50:34 PM] Sending notification '$/cancelRequest'.
2024-05-28 22:50:41.650 [info] [Trace - 10:50:41 PM] Received notification 'window/logMessage'.
2024-05-28 22:50:41.651 [info] file:///home/ihor/github/spark_framework/spark_framework/core.py :
[
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 398,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (169/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 638,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (167/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 700,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (164/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1141,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (172/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1154,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1197,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1315,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (162/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1763,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1766,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (163/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1840,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (190/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1866,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (152/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1885,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (155/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "too-many-lines",
        "message": "Too many lines in module (2035/1000)",
        "message-id": "C0302"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1126,
        "column": 5,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "fixme",
        "message": "TODO:  think on what to do with columns that are structures with properties, e.g. event_properties.sceneId",
        "message-id": "W0511"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 14,
        "column": 0,
        "endLine": 14,
        "endColumn": 43,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.dataframe'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 15,
        "column": 0,
        "endLine": 15,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.session'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 16,
        "column": 0,
        "endLine": 17,
        "endColumn": 71,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.types'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 18,
        "column": 0,
        "endLine": 18,
        "endColumn": 59,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.functions'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 19,
        "column": 0,
        "endLine": 19,
        "endColumn": 45,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.storagelevel'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 20,
        "column": 0,
        "endLine": 20,
        "endColumn": 30,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 21,
        "column": 0,
        "endLine": 21,
        "endColumn": 34,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.functions'",
        "message-id": "E0401"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "init",
        "line": 35,
        "column": 4,
        "endLine": 35,
        "endColumn": 17,
        "path": "spark_framework/core.py",
        "symbol": "global-statement",
        "message": "Using the global statement",
        "message-id": "W0603"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join_path_char",
        "line": 105,
        "column": 4,
        "endLine": 119,
        "endColumn": 32,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-enumerate",
        "message": "Consider using enumerate instead of iterating with range and len",
        "message-id": "C0200"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "display_pdf",
        "line": 149,
        "column": 8,
        "endLine": 149,
        "endColumn": 67,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'IPython.core.display'",
        "message-id": "E0401"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "display_pdf",
        "line": 149,
        "column": 8,
        "endLine": 149,
        "endColumn": 67,
        "path": "spark_framework/core.py",
        "symbol": "import-outside-toplevel",
        "message": "Import outside toplevel (IPython.core.display.display)",
        "message-id": "C0415"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "get_fs_reader",
        "line": 235,
        "column": 42,
        "endLine": 235,
        "endColumn": 75,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "load",
        "line": 339,
        "column": 7,
        "endLine": 339,
        "endColumn": 53,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'source_type in (C_DFS, C_LOCAL)'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "load",
        "line": 285,
        "column": 0,
        "endLine": 285,
        "endColumn": 8,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (18/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_fs_writer",
        "line": 477,
        "column": 0,
        "endLine": 477,
        "endColumn": 17,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (13/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 630,
        "column": 7,
        "endLine": 630,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'dest_type in (C_DFS, C_LOCAL)'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 647,
        "column": 8,
        "endLine": 647,
        "endColumn": 72,
        "path": "spark_framework/core.py",
        "symbol": "notimplemented-raised",
        "message": "NotImplemented raised - should raise NotImplementedError",
        "message-id": "E0711"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 647,
        "column": 14,
        "endLine": 647,
        "endColumn": 72,
        "path": "spark_framework/core.py",
        "symbol": "not-callable",
        "message": "NotImplemented is not callable",
        "message-id": "E1102"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "cache",
        "line": 679,
        "column": 42,
        "endLine": 679,
        "endColumn": 80,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "cache",
        "line": 683,
        "column": 42,
        "endLine": 683,
        "endColumn": 58,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 696,
        "column": 42,
        "endLine": 696,
        "endColumn": 78,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 698,
        "column": 42,
        "endLine": 698,
        "endColumn": 69,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 700,
        "column": 42,
        "endLine": 700,
        "endColumn": 163,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "temp_table",
        "line": 718,
        "column": 4,
        "endLine": 718,
        "endColumn": 29,
        "path": "spark_framework/core.py",
        "symbol": "global-statement",
        "message": "Using the global statement",
        "message-id": "W0603"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "sql",
        "line": 736,
        "column": 8,
        "endLine": 739,
        "endColumn": 65,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-enumerate",
        "message": "Consider using enumerate instead of iterating with range and len",
        "message-id": "C0200"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "groupby_sum",
        "line": 852,
        "column": 12,
        "endLine": 852,
        "endColumn": 22,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "groupby",
        "line": 862,
        "column": 0,
        "endLine": 862,
        "endColumn": 11,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (14/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "safe_col_name",
        "line": 942,
        "column": 4,
        "endLine": 946,
        "endColumn": 47,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1010,
        "column": 16,
        "endLine": 1010,
        "endColumn": 50,
        "path": "spark_framework/core.py",
        "symbol": "unnecessary-comprehension",
        "message": "Unnecessary use of a comprehension, use list(a_add.items()) instead.",
        "message-id": "R1721"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1012,
        "column": 12,
        "endLine": 1012,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1014,
        "column": 50,
        "endLine": 1014,
        "endColumn": 60,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1033,
        "column": 18,
        "endLine": 1033,
        "endColumn": 41,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1038,
        "column": 52,
        "endLine": 1038,
        "endColumn": 71,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1040,
        "column": 28,
        "endLine": 1040,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1054,
        "column": 24,
        "endLine": 1054,
        "endColumn": 39,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1054,
        "column": 56,
        "endLine": 1054,
        "endColumn": 63,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 955,
        "column": 0,
        "endLine": 955,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (16/12)",
        "message-id": "R0912"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "change",
        "line": 1147,
        "column": 42,
        "endLine": 1147,
        "endColumn": 59,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "change",
        "line": 1060,
        "column": 0,
        "endLine": 1060,
        "endColumn": 10,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (29/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join_clause",
        "line": 1162,
        "column": 0,
        "endLine": 1162,
        "endColumn": 15,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (16/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1262,
        "column": 7,
        "endLine": 1262,
        "endColumn": 42,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'a_how in ('inner', 'left')'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1272,
        "column": 18,
        "endLine": 1272,
        "endColumn": 26,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1273,
        "column": 19,
        "endLine": 1273,
        "endColumn": 27,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1275,
        "column": 12,
        "endLine": 1289,
        "endColumn": 7,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1314,
        "column": 30,
        "endLine": 1314,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1201,
        "column": 0,
        "endLine": 1201,
        "endColumn": 8,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (18/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "calc_stat_local",
        "line": 1456,
        "column": 0,
        "endLine": 1456,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (26/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "show_values",
        "line": 1747,
        "column": 11,
        "endLine": 1747,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "chained-comparison",
        "message": "Simplify chained comparison between the operands",
        "message-id": "R1716"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_missing_key",
        "line": 1777,
        "column": 4,
        "endLine": 1799,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_missing_key",
        "line": 1791,
        "column": 12,
        "endLine": 1791,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1810,
        "column": 4,
        "endLine": 1825,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1812,
        "column": 18,
        "endLine": 1812,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1817,
        "column": 12,
        "endLine": 1817,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1842,
        "column": 4,
        "endLine": 1861,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1844,
        "column": 18,
        "endLine": 1844,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1853,
        "column": 12,
        "endLine": 1853,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_null_check",
        "line": 1867,
        "column": 4,
        "endLine": 1875,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_null_check",
        "line": 1871,
        "column": 12,
        "endLine": 1871,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_schemas_equal",
        "line": 1887,
        "column": 12,
        "endLine": 1887,
        "endColumn": 64,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_schemas_equal",
        "line": 1878,
        "column": 0,
        "endLine": 1878,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "inconsistent-return-statements",
        "message": "Either all return statements in a function should return an expression, or none of them should.",
        "message-id": "R1710"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1898,
        "column": 4,
        "endLine": 1902,
        "endColumn": 68,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1899,
        "column": 15,
        "endLine": 1899,
        "endColumn": 24,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1899,
        "column": 37,
        "endLine": 1899,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1902,
        "column": 15,
        "endLine": 1902,
        "endColumn": 26,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "df_schema_to_str",
        "line": 1910,
        "column": 11,
        "endLine": 1910,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "protected-access",
        "message": "Access to a protected member _jdf of a client class",
        "message-id": "W0212"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "schema_to_code",
        "line": 1927,
        "column": 4,
        "endLine": 1933,
        "endColumn": 35,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"elif\" after \"return\", remove the leading \"el\" from \"elif\"",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "schema_to_code",
        "line": 1931,
        "column": 15,
        "endLine": 1931,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "hide_sensitive_str",
        "line": 1986,
        "column": 27,
        "endLine": 1986,
        "endColumn": 34,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2003,
        "column": 4,
        "endLine": 2022,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"elif\" after \"return\", remove the leading \"el\" from \"elif\"",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2022,
        "column": 8,
        "endLine": 2022,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2002,
        "column": 0,
        "endLine": 2002,
        "endColumn": 23,
        "path": "spark_framework/core.py",
        "symbol": "too-many-return-statements",
        "message": "Too many return statements (9/6)",
        "message-id": "R0911"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 8,
        "column": 0,
        "endLine": 8,
        "endColumn": 29,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"from datetime import datetime\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 9,
        "column": 0,
        "endLine": 9,
        "endColumn": 9,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"import re\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 23,
        "column": 0,
        "endLine": 23,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"from typing import TypeVar, Union, Tuple\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    }
]

2024-05-28 22:50:41.655 [info] [Trace - 10:50:41 PM] Received notification 'textDocument/publishDiagnostics'.
2024-05-28 22:50:41.661 [info] [Trace - 10:50:41 PM] Received response 'textDocument/codeAction - (1)' in 11420ms.
2024-05-28 22:50:41.662 [info] [Trace - 10:50:41 PM] Received response 'textDocument/codeAction - (2)' in 11384ms.
2024-05-28 22:50:41.663 [info] [Trace - 10:50:41 PM] Received response 'textDocument/codeAction - (3)' in 10775ms.
2024-05-28 22:50:41.664 [info] [Trace - 10:50:41 PM] Received response 'textDocument/codeAction - (4)' in 7086ms.
2024-05-28 22:50:41.963 [info] [Trace - 10:50:41 PM] Sending request 'textDocument/codeAction - (5)'.
2024-05-28 22:50:41.970 [info] [Trace - 10:50:41 PM] Received response 'textDocument/codeAction - (5)' in 7ms.

Now I am running `pip install pyspark` and opening the logs after the installation finishes:

2024-05-28 22:52:02.640 [info] [Trace - 10:52:02 PM] Sending request 'textDocument/codeAction - (6)'.
2024-05-28 22:52:02.646 [info] [Trace - 10:52:02 PM] Received response 'textDocument/codeAction - (6)' in 6ms.
2024-05-28 22:52:02.683 [info] [Trace - 10:52:02 PM] Sending request 'textDocument/codeAction - (7)'.
2024-05-28 22:52:02.685 [info] [Trace - 10:52:02 PM] Received response 'textDocument/codeAction - (7)' in 2ms.
2024-05-28 22:52:02.713 [info] [Trace - 10:52:02 PM] Sending request 'textDocument/codeAction - (8)'.
2024-05-28 22:52:02.715 [info] [Trace - 10:52:02 PM] Received response 'textDocument/codeAction - (8)' in 2ms.
2024-05-28 22:52:17.192 [info] [Trace - 10:52:17 PM] Sending request 'textDocument/codeAction - (9)'.
2024-05-28 22:52:17.195 [info] [Trace - 10:52:17 PM] Received response 'textDocument/codeAction - (9)' in 3ms.
2024-05-28 22:52:21.588 [info] [Trace - 10:52:21 PM] Sending request 'textDocument/codeAction - (10)'.
2024-05-28 22:52:21.590 [info] [Trace - 10:52:21 PM] Received response 'textDocument/codeAction - (10)' in 2ms.

At this moment core.py still shows the import errors (while it should NOT, because pyspark is now installed).
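For reference, one can confirm from the same `.venv` that the interpreter itself resolves the package, so the remaining E0401 diagnostics come from a stale linter run rather than from the environment. A minimal check (the helper name `is_installed` is just illustrative):

```python
# Check whether a module is resolvable in the current environment without
# actually importing it. If this returns True for "pyspark" while the
# editor still shows E0401, the linter is working from stale state.
import importlib.util


def is_installed(module_name: str) -> bool:
    return importlib.util.find_spec(module_name) is not None


print(is_installed("json"))     # stdlib module, always resolvable
print(is_installed("pyspark"))  # True once `pip install pyspark` has run
```

Running this with `.venv/bin/python` prints `True` for pyspark in my setup, which is why I expect the diagnostics to clear without a manual restart.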

Now I am closing core.py and re-opening it again:

2024-05-28 22:53:44.427 [info] [Trace - 10:53:44 PM] Sending notification 'textDocument/didClose'.
2024-05-28 22:53:44.432 [info] [Trace - 10:53:44 PM] Received notification 'textDocument/publishDiagnostics'.
2024-05-28 22:53:48.799 [info] [Trace - 10:53:48 PM] Sending notification 'textDocument/didOpen'.
2024-05-28 22:53:48.844 [info] [Trace - 10:53:48 PM] Sending request 'textDocument/codeAction - (11)'.
2024-05-28 22:53:48.847 [info] [Trace - 10:53:48 PM] Received notification 'window/logMessage'.
2024-05-28 22:53:48.848 [info] /home/ihor/github/spark_framework/.venv/bin/python -m pylint --reports=n --output-format=json --disable=missing-module-docstring --disable=missing-class-docstring --disable=missing-function-docstring --disable=too-many-arguments --disable=too-many-locals --disable=too-many-statements --disable=unsubscriptable-object --disable=trailing-whitespace --disable=unsupported-assignment-operation --max-line-length=150 --clear-cache-post-run=y --from-stdin /home/ihor/github/spark_framework/spark_framework/core.py
2024-05-28 22:53:48.853 [info] [Trace - 10:53:48 PM] Received notification 'window/logMessage'.
2024-05-28 22:53:48.853 [info] CWD Linter: /home/ihor/github/spark_framework
2024-05-28 22:53:49.264 [info] [Trace - 10:53:49 PM] Sending request 'textDocument/codeAction - (12)'.
2024-05-28 22:53:49.265 [info] [Trace - 10:53:49 PM] Sending notification '$/cancelRequest'.
2024-05-28 22:54:01.701 [info] [Trace - 10:54:01 PM] Received notification 'window/logMessage'.
2024-05-28 22:54:01.701 [info] file:///home/ihor/github/spark_framework/spark_framework/core.py :
[
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 398,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (169/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 638,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (167/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 700,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (164/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1141,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (172/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1154,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1197,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1315,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (162/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1763,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (153/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1766,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (163/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1840,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (190/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1866,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (152/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1885,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "line-too-long",
        "message": "Line too long (155/150)",
        "message-id": "C0301"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1,
        "column": 0,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "too-many-lines",
        "message": "Too many lines in module (2035/1000)",
        "message-id": "C0302"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "",
        "line": 1126,
        "column": 5,
        "endLine": null,
        "endColumn": null,
        "path": "spark_framework/core.py",
        "symbol": "fixme",
        "message": "TODO:  think on what to do with columns that are structures with properties, e.g. event_properties.sceneId",
        "message-id": "W0511"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 14,
        "column": 0,
        "endLine": 14,
        "endColumn": 43,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.dataframe'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 15,
        "column": 0,
        "endLine": 15,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.session'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 16,
        "column": 0,
        "endLine": 17,
        "endColumn": 71,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.types'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 18,
        "column": 0,
        "endLine": 18,
        "endColumn": 59,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.functions'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 19,
        "column": 0,
        "endLine": 19,
        "endColumn": 45,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.storagelevel'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 20,
        "column": 0,
        "endLine": 20,
        "endColumn": 30,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql'",
        "message-id": "E0401"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "",
        "line": 21,
        "column": 0,
        "endLine": 21,
        "endColumn": 34,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'pyspark.sql.functions'",
        "message-id": "E0401"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "init",
        "line": 35,
        "column": 4,
        "endLine": 35,
        "endColumn": 17,
        "path": "spark_framework/core.py",
        "symbol": "global-statement",
        "message": "Using the global statement",
        "message-id": "W0603"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join_path_char",
        "line": 105,
        "column": 4,
        "endLine": 119,
        "endColumn": 32,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-enumerate",
        "message": "Consider using enumerate instead of iterating with range and len",
        "message-id": "C0200"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "display_pdf",
        "line": 149,
        "column": 8,
        "endLine": 149,
        "endColumn": 67,
        "path": "spark_framework/core.py",
        "symbol": "import-error",
        "message": "Unable to import 'IPython.core.display'",
        "message-id": "E0401"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "display_pdf",
        "line": 149,
        "column": 8,
        "endLine": 149,
        "endColumn": 67,
        "path": "spark_framework/core.py",
        "symbol": "import-outside-toplevel",
        "message": "Import outside toplevel (IPython.core.display.display)",
        "message-id": "C0415"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "get_fs_reader",
        "line": 235,
        "column": 42,
        "endLine": 235,
        "endColumn": 75,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "load",
        "line": 339,
        "column": 7,
        "endLine": 339,
        "endColumn": 53,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'source_type in (C_DFS, C_LOCAL)'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "load",
        "line": 285,
        "column": 0,
        "endLine": 285,
        "endColumn": 8,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (18/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_fs_writer",
        "line": 477,
        "column": 0,
        "endLine": 477,
        "endColumn": 17,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (13/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 630,
        "column": 7,
        "endLine": 630,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'dest_type in (C_DFS, C_LOCAL)'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 647,
        "column": 8,
        "endLine": 647,
        "endColumn": 72,
        "path": "spark_framework/core.py",
        "symbol": "notimplemented-raised",
        "message": "NotImplemented raised - should raise NotImplementedError",
        "message-id": "E0711"
    },
    {
        "type": "error",
        "module": "spark_framework.core",
        "obj": "save",
        "line": 647,
        "column": 14,
        "endLine": 647,
        "endColumn": 72,
        "path": "spark_framework/core.py",
        "symbol": "not-callable",
        "message": "NotImplemented is not callable",
        "message-id": "E1102"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "cache",
        "line": 679,
        "column": 42,
        "endLine": 679,
        "endColumn": 80,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "cache",
        "line": 683,
        "column": 42,
        "endLine": 683,
        "endColumn": 58,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 696,
        "column": 42,
        "endLine": 696,
        "endColumn": 78,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 698,
        "column": 42,
        "endLine": 698,
        "endColumn": 69,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "unpersist",
        "line": 700,
        "column": 42,
        "endLine": 700,
        "endColumn": 163,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "temp_table",
        "line": 718,
        "column": 4,
        "endLine": 718,
        "endColumn": 29,
        "path": "spark_framework/core.py",
        "symbol": "global-statement",
        "message": "Using the global statement",
        "message-id": "W0603"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "sql",
        "line": 736,
        "column": 8,
        "endLine": 739,
        "endColumn": 65,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-enumerate",
        "message": "Consider using enumerate instead of iterating with range and len",
        "message-id": "C0200"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "groupby_sum",
        "line": 852,
        "column": 12,
        "endLine": 852,
        "endColumn": 22,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "groupby",
        "line": 862,
        "column": 0,
        "endLine": 862,
        "endColumn": 11,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (14/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "safe_col_name",
        "line": 942,
        "column": 4,
        "endLine": 946,
        "endColumn": 47,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1010,
        "column": 16,
        "endLine": 1010,
        "endColumn": 50,
        "path": "spark_framework/core.py",
        "symbol": "unnecessary-comprehension",
        "message": "Unnecessary use of a comprehension, use list(a_add.items()) instead.",
        "message-id": "R1721"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1012,
        "column": 12,
        "endLine": 1012,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1014,
        "column": 50,
        "endLine": 1014,
        "endColumn": 60,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1033,
        "column": 18,
        "endLine": 1033,
        "endColumn": 41,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1038,
        "column": 52,
        "endLine": 1038,
        "endColumn": 71,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1040,
        "column": 28,
        "endLine": 1040,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1054,
        "column": 24,
        "endLine": 1054,
        "endColumn": 39,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 1054,
        "column": 56,
        "endLine": 1054,
        "endColumn": 63,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "get_change_query",
        "line": 955,
        "column": 0,
        "endLine": 955,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (16/12)",
        "message-id": "R0912"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "change",
        "line": 1147,
        "column": 42,
        "endLine": 1147,
        "endColumn": 59,
        "path": "spark_framework/core.py",
        "symbol": "f-string-without-interpolation",
        "message": "Using an f-string that does not have any interpolated variables",
        "message-id": "W1309"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "change",
        "line": 1060,
        "column": 0,
        "endLine": 1060,
        "endColumn": 10,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (29/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join_clause",
        "line": 1162,
        "column": 0,
        "endLine": 1162,
        "endColumn": 15,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (16/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1262,
        "column": 7,
        "endLine": 1262,
        "endColumn": 42,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-in",
        "message": "Consider merging these comparisons with 'in' by using 'a_how in ('inner', 'left')'. Use a set instead if elements are hashable.",
        "message-id": "R1714"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1272,
        "column": 18,
        "endLine": 1272,
        "endColumn": 26,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1273,
        "column": 19,
        "endLine": 1273,
        "endColumn": 27,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1275,
        "column": 12,
        "endLine": 1289,
        "endColumn": 7,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1314,
        "column": 30,
        "endLine": 1314,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "join",
        "line": 1201,
        "column": 0,
        "endLine": 1201,
        "endColumn": 8,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (18/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "calc_stat_local",
        "line": 1456,
        "column": 0,
        "endLine": 1456,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "too-many-branches",
        "message": "Too many branches (26/12)",
        "message-id": "R0912"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "show_values",
        "line": 1747,
        "column": 11,
        "endLine": 1747,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "chained-comparison",
        "message": "Simplify chained comparison between the operands",
        "message-id": "R1716"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_missing_key",
        "line": 1777,
        "column": 4,
        "endLine": 1799,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_missing_key",
        "line": 1791,
        "column": 12,
        "endLine": 1791,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1810,
        "column": 4,
        "endLine": 1825,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1812,
        "column": 18,
        "endLine": 1812,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_check_duplicates",
        "line": 1817,
        "column": 12,
        "endLine": 1817,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1842,
        "column": 4,
        "endLine": 1861,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1844,
        "column": 18,
        "endLine": 1844,
        "endColumn": 33,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_check_dependent_columns",
        "line": 1853,
        "column": 12,
        "endLine": 1853,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_null_check",
        "line": 1867,
        "column": 4,
        "endLine": 1875,
        "endColumn": 19,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_null_check",
        "line": 1871,
        "column": 12,
        "endLine": 1871,
        "endColumn": 49,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "ut_schemas_equal",
        "line": 1887,
        "column": 12,
        "endLine": 1887,
        "endColumn": 64,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "ut_schemas_equal",
        "line": 1878,
        "column": 0,
        "endLine": 1878,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "inconsistent-return-statements",
        "message": "Either all return statements in a function should return an expression, or none of them should.",
        "message-id": "R1710"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1898,
        "column": 4,
        "endLine": 1902,
        "endColumn": 68,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"else\" after \"return\", remove the \"else\" and de-indent the code inside it",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1899,
        "column": 15,
        "endLine": 1899,
        "endColumn": 24,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1899,
        "column": 37,
        "endLine": 1899,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "row_to_str",
        "line": 1902,
        "column": 15,
        "endLine": 1902,
        "endColumn": 26,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "df_schema_to_str",
        "line": 1910,
        "column": 11,
        "endLine": 1910,
        "endColumn": 20,
        "path": "spark_framework/core.py",
        "symbol": "protected-access",
        "message": "Access to a protected member _jdf of a client class",
        "message-id": "W0212"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "schema_to_code",
        "line": 1927,
        "column": 4,
        "endLine": 1933,
        "endColumn": 35,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"elif\" after \"return\", remove the leading \"el\" from \"elif\"",
        "message-id": "R1705"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "schema_to_code",
        "line": 1931,
        "column": 15,
        "endLine": 1931,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "hide_sensitive_str",
        "line": 1986,
        "column": 27,
        "endLine": 1986,
        "endColumn": 34,
        "path": "spark_framework/core.py",
        "symbol": "consider-using-f-string",
        "message": "Formatting a regular string which could be an f-string",
        "message-id": "C0209"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2003,
        "column": 4,
        "endLine": 2022,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "no-else-return",
        "message": "Unnecessary \"elif\" after \"return\", remove the leading \"el\" from \"elif\"",
        "message-id": "R1705"
    },
    {
        "type": "warning",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2022,
        "column": 8,
        "endLine": 2022,
        "endColumn": 44,
        "path": "spark_framework/core.py",
        "symbol": "broad-exception-raised",
        "message": "Raising too general exception: Exception",
        "message-id": "W0719"
    },
    {
        "type": "refactor",
        "module": "spark_framework.core",
        "obj": "numpy_to_spark_type",
        "line": 2002,
        "column": 0,
        "endLine": 2002,
        "endColumn": 23,
        "path": "spark_framework/core.py",
        "symbol": "too-many-return-statements",
        "message": "Too many return statements (9/6)",
        "message-id": "R0911"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 8,
        "column": 0,
        "endLine": 8,
        "endColumn": 29,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"from datetime import datetime\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 9,
        "column": 0,
        "endLine": 9,
        "endColumn": 9,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"import re\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    },
    {
        "type": "convention",
        "module": "spark_framework.core",
        "obj": "",
        "line": 23,
        "column": 0,
        "endLine": 23,
        "endColumn": 40,
        "path": "spark_framework/core.py",
        "symbol": "wrong-import-order",
        "message": "standard import \"from typing import TypeVar, Union, Tuple\" should be placed before \"import numpy as np\"",
        "message-id": "C0411"
    }
]

2024-05-28 22:54:01.704 [info] [Trace - 10:54:01 PM] Received notification 'textDocument/publishDiagnostics'.
2024-05-28 22:54:01.706 [info] [Trace - 10:54:01 PM] Received response 'textDocument/codeAction - (11)' in 12862ms.
2024-05-28 22:54:01.706 [info] [Trace - 10:54:01 PM] Received response 'textDocument/codeAction - (12)' in 12442ms.
2024-05-28 22:54:02.049 [info] [Trace - 10:54:02 PM] Sending request 'textDocument/codeAction - (13)'.
2024-05-28 22:54:02.052 [info] [Trace - 10:54:02 PM] Received response 'textDocument/codeAction - (13)' in 3ms.

And it looks like the problem has gone.

But here is the actual problem: as soon as I ran `pip install pyspark`, the already-open editors using it had no idea that their code now references an installed library, until I closed and re-opened them.

Is it possible to automatically re-launch pylint after `pip install` is done?
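One reason a long-running linter process can miss freshly installed packages is that Python's import machinery caches filesystem state per directory and only refreshes it when told to. A minimal sketch of that mechanism (the module name `newly_installed` is invented for illustration; this is an analogy for the linter's behavior, not the extension's actual code path):

```python
import importlib
import os
import sys
import tempfile

# Simulate a long-running process whose import caches were primed
# before a package was installed.
tmp = tempfile.mkdtemp()
sys.path.insert(0, tmp)

try:
    import newly_installed  # hypothetical module; primes the finder cache
except ImportError:
    pass

# "Install" the module after the cache was primed.
with open(os.path.join(tmp, "newly_installed.py"), "w") as f:
    f.write("VALUE = 42\n")

# Without a cache refresh the stale directory listing may still be used;
# invalidate_caches() forces the finders to re-scan sys.path entries.
importlib.invalidate_caches()
import newly_installed

print(newly_installed.VALUE)  # prints 42
```

A linter server that never performs the equivalent of this refresh keeps reporting the import as missing until it is restarted.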

@ibobak
Author

ibobak commented May 28, 2024

I got one more related bad behavior: here is the video https://youtu.be/fZo5GL1veLg

@ibobak
Author

ibobak commented May 28, 2024

This is the whole log that was captured during the video:
pylint.txt

@karthiknadig karthiknadig self-assigned this May 28, 2024
@karthiknadig
Member

@ibobak We don't plan to do this automatically. Monitoring package installs and large folder trees can cause significant performance issues. You can use the Pylint: Restart command from the Command Palette instead of closing and re-opening the editor.
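Until an automatic mechanism exists, the restart can be put a keystroke away. A sketch for `keybindings.json`, assuming the extension's restart command ID is `pylint.restart` (verify the actual ID via Preferences: Open Keyboard Shortcuts (JSON); the key chord chosen here is arbitrary):

```jsonc
// keybindings.json — "pylint.restart" is an assumed command ID
[
    {
        "key": "ctrl+alt+r",
        "command": "pylint.restart"
    }
]
```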

I will leave this open, in case we have alternative ways to do this.

@karthiknadig karthiknadig changed the title "Unable to import" after library is installed Automatic restart after library is installed Jun 12, 2024
@karthiknadig karthiknadig removed their assignment Jun 12, 2024
@karthiknadig karthiknadig added feature-request Request for new features or functionality and removed bug Issue identified by VS Code Team member as probable bug triage-needed Issue is not triaged. info-needed Issue requires more information from poster labels Jun 12, 2024
@ibobak
Author

ibobak commented Jun 12, 2024

@karthiknadig I would like to kindly ask you to watch this short video: https://youtu.be/fZo5GL1veLg

I've shown how to reproduce the error:

image

@ibobak
Author

ibobak commented Jun 12, 2024

My point is this: the lines are either erroneous, or they are correct.

But in the video I show that they are at first not marked (= correct), and then, right after "go to declaration" and "back", they suddenly become erroneous.

@karthiknadig
Member

@ibobak The one in the video seems like a different problem — some kind of caching issue. Can you try the latest pre-release version of the extension, which ships the latest pylint?
