31 Commits

Author SHA1 Message Date
Félix Boisselier
c7e39da528 Merge pull request #23 from Frix-x/venv
v2.0.0
2023-12-11 15:41:31 +01:00
Félix Boisselier
1a0ee0a162 Merge branch 'main' into venv 2023-12-11 15:41:05 +01:00
Félix Boisselier
87cb9015fa updated documentation 2023-12-11 14:39:21 +00:00
Félix Boisselier
b32abe2eca belt differential spectrogram now show the impact of each belt with its own color 2023-12-10 12:47:59 +01:00
Félix Boisselier
7050018274 fixed #7 and code optimizations 2023-12-10 06:40:08 +01:00
Félix Boisselier
8721488d8c added axes_map computation 2023-12-08 17:57:50 +01:00
Félix Boisselier
7ba692954f some cleaning 2023-11-29 16:12:40 +01:00
Félix Boisselier
9ce3677a00 Update spelling in documentation (#20)
Co-authored-by: Wayne Manion <treowayne@gmail.com>
2023-11-29 16:01:29 +01:00
Félix Boisselier
3a9cb57f31 splitted cfg files to allow a selection of only some of the macros 2023-11-29 12:04:59 +01:00
Félix Boisselier
43a205d036 now using a venv to run the scripts 2023-11-29 11:43:21 +01:00
Félix Boisselier
a1e9269ba3 Merge pull request #16 from Frix-x/accel-patch
Allow user input for ACCEL on vibration measurements
and use a low accel by default (with automated restore of the previous values at the end of the test) to get proper measurements
2023-11-27 23:36:35 +01:00
Félix Boisselier
8e304a71ca fixed typo in axis selection for EXCITATE_AXIS_AT_FREQ 2023-11-27 20:44:44 +01:00
Félix Boisselier
5d54db0ca0 Merge pull request #15 from Frix-x/img-link
improved documentation UX with images links
Also added a long banner to avoid cluttering space when it's not needed (in the documentation)
2023-11-27 17:44:49 +01:00
Félix Boisselier
d52680738f documentation images as links 2023-11-27 17:42:33 +01:00
Félix Boisselier
f95c55230b added proper management of the vibration test accels 2023-11-27 17:07:20 +01:00
Fragmon
0f7fa66af4 Update IS_vibrations_measurements.cfg (#14)
Co-authored-by: Félix Boisselier <accounts@fboisselier.fr>
2023-11-27 15:51:28 +01:00
Félix Boisselier
da10593ca7 added filesystem sync and file handler checks to avoid going too fast with corrupted CSVs 2023-11-27 15:08:04 +01:00
Félix Boisselier
060a800cc3 revert a4c2ead and add a PermissionError check instead 2023-11-24 17:09:12 +01:00
Félix Boisselier
7c76be5077 Merge pull request #9 from Frix-x/filedescriptors-fix
using fcntl to check if a file is still open by klipper
2023-11-20 09:39:04 +01:00
Félix Boisselier
a4c2ead732 using fcntl to check if a file is still open by klipper 2023-11-19 18:33:47 +01:00
Félix Boisselier
6e884528c0 Merge pull request #6 from Frix-x/develop
replaced TwoSlopNorm by a custom norm
to allow older version of matplotlib to be used
2023-11-06 22:34:22 +01:00
Félix Boisselier
17ccddfa0f replaced TwoSlopNorm by a custom norm 2023-11-06 22:33:02 +01:00
Félix Boisselier
83f517758a Merge pull request #4 from Frix-x/develop
v1.1.1
2023-11-01 20:09:50 +01:00
Félix Boisselier
c156459420 updated the low vibration shaper detection logic to avoid unusable choices 2023-11-01 20:08:58 +01:00
Félix Boisselier
5366ad0581 Merge pull request #3 from Frix-x/develop
modified the low vibration shaper recommendation mechanism
2023-10-31 22:35:08 +01:00
Félix Boisselier
77bfc7ca42 Merge branch 'main' into develop 2023-10-31 22:34:22 +01:00
Félix Boisselier
ce0330a9d1 modified the low vibration shaper recommendation 2023-10-31 22:23:06 +01:00
Félix Boisselier
358773ddef Merge pull request #2 from Frix-x/develop
Localisation fix and additional safety checks
2023-10-28 14:11:16 +02:00
Félix Boisselier
d0930261f7 removed symbols in console prints 2023-10-28 14:09:59 +02:00
Félix Boisselier
a03a3c2e4b Added some safety checks and forced C locale for console printing 2023-10-27 14:44:06 +02:00
Félix Boisselier
c102d4145c fixed MHI LUT to give values on all the range 2023-10-26 18:52:34 +02:00
25 changed files with 880 additions and 439 deletions

.gitignore

@@ -0,0 +1,160 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/


@@ -1,97 +0,0 @@
################################################
###### STANDARD INPUT_SHAPER CALIBRATIONS ######
################################################
# Written by Frix_x#0161 #
# @version: 1.4
# CHANGELOG:
# v1.4: added possibility to only run one axis at a time for the axes shaper calibration
# v1.3: added possibility to override the default parameters
# v1.2: added EXCITATE_AXIS_AT_FREQ to hold a specific excitation frequency on an axis and diagnose mechanical problems
# v1.1: added M400 to validate that the files are correctly saved to disk
# v1.0: first version of the automatic input shaper workflow
### What is it? ###
# This macro helps you configure the input shaper algorithm of Klipper by running the tests sequentially and calling an automatic script
# that generates the graphs, manages the files, and so on. It's basically a fully automatic input shaper calibration workflow.
# Results can be found in your config folder using the Fluidd/Mainsail file manager.
# The goal is to make it easy to set up, share, and use.
# Usage:
# 1. Call the AXES_SHAPER_CALIBRATION macro, wait for it to end and compute the graphs. Then look for the results in the results folder.
# 2. Call the BELTS_SHAPER_CALIBRATION macro, wait for it to end and compute the graphs. Then look for the results in the results folder.
# 3. If you notice some strange noise, you can use the EXCITATE_AXIS_AT_FREQ macro to diagnose its origin
[gcode_macro AXES_SHAPER_CALIBRATION]
description: Run standard input shaper test for all axes
gcode:
{% set verbose = params.VERBOSE|default(true) %}
{% set min_freq = params.FREQ_START|default(5)|float %}
{% set max_freq = params.FREQ_END|default(133.3)|float %}
{% set hz_per_sec = params.HZ_PER_SEC|default(1)|float %}
{% set axis = params.AXIS|default("all")|string|lower %}
{% set X, Y = False, False %}
{% if axis == "all" %}
{% set X, Y = True, True %}
{% elif axis == "x" %}
{% set X = True %}
{% elif axis == "y" %}
{% set Y = True %}
{% else %}
{ action_raise_error("AXIS selection invalid. Should be either all, x or y!") }
{% endif %}
{% if X %}
TEST_RESONANCES AXIS=X OUTPUT=raw_data NAME=x FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
{% if verbose %}
RESPOND MSG="X axis shaper graphs generation..."
{% endif %}
RUN_SHELL_COMMAND CMD=plot_graph PARAMS=SHAPER
{% endif %}
{% if Y %}
TEST_RESONANCES AXIS=Y OUTPUT=raw_data NAME=y FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
{% if verbose %}
RESPOND MSG="Y axis shaper graphs generation..."
{% endif %}
RUN_SHELL_COMMAND CMD=plot_graph PARAMS=SHAPER
{% endif %}
[gcode_macro BELTS_SHAPER_CALIBRATION]
description: Run a custom half-axis test to analyze belts on CoreXY printers
gcode:
{% set verbose = params.VERBOSE|default(true) %}
{% set min_freq = params.FREQ_START|default(5)|float %}
{% set max_freq = params.FREQ_END|default(133.33)|float %}
{% set hz_per_sec = params.HZ_PER_SEC|default(1)|float %}
TEST_RESONANCES AXIS=1,1 OUTPUT=raw_data NAME=b FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
TEST_RESONANCES AXIS=1,-1 OUTPUT=raw_data NAME=a FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
{% if verbose %}
RESPOND MSG="Belts graphs generation..."
{% endif %}
RUN_SHELL_COMMAND CMD=plot_graph PARAMS=BELTS
[gcode_macro EXCITATE_AXIS_AT_FREQ]
description: Maintain a specified input shaper excitation frequency for some time to diagnose vibrations
gcode:
{% set FREQUENCY = params.FREQUENCY|default(25)|int %}
{% set TIME = params.TIME|default(10)|int %}
{% set AXIS = params.AXIS|default("x")|string|lower %}
TEST_RESONANCES OUTPUT=raw_data AXIS={AXIS} FREQ_START={FREQUENCY-1} FREQ_END={FREQUENCY+1} HZ_PER_SEC={1/(TIME/3)}
M400


@@ -1,4 +0,0 @@
[gcode_shell_command plot_graph]
command: ~/printer_data/config/K-ShakeTune/scripts/is_workflow.py
timeout: 600.0
verbose: True


@@ -0,0 +1,60 @@
############################################################
###### AXE_MAP DETECTION AND ACCELEROMETER VALIDATION ######
############################################################
# Written by Frix_x#0161 #
[gcode_macro AXES_MAP_CALIBRATION]
gcode:
{% set z_height = params.Z_HEIGHT|default(20)|int %} # z height to put the toolhead before starting the movements
{% set speed = params.SPEED|default(80)|float * 60 %} # feedrate for the movements
{% set accel = params.ACCEL|default(1500)|int %} # accel value used to move on the pattern
{% set feedrate_travel = params.TRAVEL_SPEED|default(120)|int * 60 %} # travel feedrate between moves
{% set accel_chip = params.ACCEL_CHIP|default("adxl345") %} # ADXL chip name in the config
{% set mid_x = printer.toolhead.axis_maximum.x|float / 2 %}
{% set mid_y = printer.toolhead.axis_maximum.y|float / 2 %}
{% set accel = [accel, printer.configfile.settings.printer.max_accel]|min %}
{% set old_accel = printer.toolhead.max_accel %}
{% set old_accel_to_decel = printer.toolhead.max_accel_to_decel %}
{% set old_sqv = printer.toolhead.square_corner_velocity %}
{% if not 'xyz' in printer.toolhead.homed_axes %}
{ action_raise_error("Must Home printer first!") }
{% endif %}
{action_respond_info("")}
{action_respond_info("Starting accelerometer axe_map calibration")}
{action_respond_info("This operation can not be interrupted by normal means. Hit the \"emergency stop\" button to stop it if needed")}
{action_respond_info("")}
SAVE_GCODE_STATE NAME=STATE_AXESMAP_CALIBRATION
G90
# Set the wanted acceleration values (not too high to avoid oscillation, not too low to be able to reach constant speed on each segment)
SET_VELOCITY_LIMIT ACCEL={accel} ACCEL_TO_DECEL={accel} SQUARE_CORNER_VELOCITY={[(accel / 1000), 5.0]|max}
# Going to the start position
G1 Z{z_height} F{feedrate_travel / 8}
G1 X{mid_x - 15} Y{mid_y - 15} F{feedrate_travel}
G4 P500
ACCELEROMETER_MEASURE CHIP={accel_chip}
G4 P1000 # This first waiting time is to record the background accelerometer noise before moving
G1 X{mid_x + 15} F{speed}
G4 P1000
G1 Y{mid_y + 15} F{speed}
G4 P1000
G1 Z{z_height + 15} F{speed}
G4 P1000
ACCELEROMETER_MEASURE CHIP={accel_chip} NAME=axemap
RESPOND MSG="Analysis of the movements..."
RUN_SHELL_COMMAND CMD=shaketune PARAMS="AXESMAP {accel}"
# Restore the previous acceleration values
SET_VELOCITY_LIMIT ACCEL={old_accel} ACCEL_TO_DECEL={old_accel_to_decel} SQUARE_CORNER_VELOCITY={old_sqv}
RESTORE_GCODE_STATE NAME=STATE_AXESMAP_CALIBRATION
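Two small guards in the macro above are easy to miss: the requested ACCEL is clamped to the printer's configured max_accel, and SQUARE_CORNER_VELOCITY is scaled with the acceleration but floored at 5 mm/s. A minimal sketch of that Jinja arithmetic in plain Python (the function name is illustrative only):

```python
def velocity_limits(requested_accel, printer_max_accel):
    """Mirror of the macro's Jinja expressions:
    {% set accel = [accel, printer.configfile.settings.printer.max_accel]|min %}
    SQUARE_CORNER_VELOCITY={[(accel / 1000), 5.0]|max}
    """
    accel = min(requested_accel, printer_max_accel)  # never exceed the configured limit
    scv = max(accel / 1000, 5.0)                     # scale SCV with accel, floor at 5 mm/s
    return accel, scv

print(velocity_limits(1500, 3000))   # (1500, 5.0) -- default ACCEL, SCV floor applies
print(velocity_limits(9000, 7000))   # (7000, 7.0) -- clamped, SCV scales with accel
```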


@@ -0,0 +1,42 @@
################################################
###### STANDARD INPUT_SHAPER CALIBRATIONS ######
################################################
# Written by Frix_x#0161 #
[gcode_macro AXES_SHAPER_CALIBRATION]
description: Perform standard axis input shaper tests on one or both XY axes to select the best input shaper filter
gcode:
{% set min_freq = params.FREQ_START|default(5)|float %}
{% set max_freq = params.FREQ_END|default(133.3)|float %}
{% set hz_per_sec = params.HZ_PER_SEC|default(1)|float %}
{% set axis = params.AXIS|default("all")|string|lower %}
{% set X, Y = False, False %}
{% if axis == "all" %}
{% set X, Y = True, True %}
{% elif axis == "x" %}
{% set X = True %}
{% elif axis == "y" %}
{% set Y = True %}
{% else %}
{ action_raise_error("AXIS selection invalid. Should be either all, x or y!") }
{% endif %}
{% if X %}
TEST_RESONANCES AXIS=X OUTPUT=raw_data NAME=x FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
RESPOND MSG="X axis frequency profile generation..."
RESPOND MSG="This may take some time (1-3min)"
RUN_SHELL_COMMAND CMD=shaketune PARAMS=SHAPER
{% endif %}
{% if Y %}
TEST_RESONANCES AXIS=Y OUTPUT=raw_data NAME=y FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
RESPOND MSG="Y axis frequency profile generation..."
RESPOND MSG="This may take some time (1-3min)"
RUN_SHELL_COMMAND CMD=shaketune PARAMS=SHAPER
{% endif %}


@@ -0,0 +1,21 @@
################################################
###### STANDARD INPUT_SHAPER CALIBRATIONS ######
################################################
# Written by Frix_x#0161 #
[gcode_macro BELTS_SHAPER_CALIBRATION]
description: Perform a custom half-axis test to analyze and compare the frequency profiles of individual belts on CoreXY printers
gcode:
{% set min_freq = params.FREQ_START|default(5)|float %}
{% set max_freq = params.FREQ_END|default(133.33)|float %}
{% set hz_per_sec = params.HZ_PER_SEC|default(1)|float %}
TEST_RESONANCES AXIS=1,1 OUTPUT=raw_data NAME=b FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
TEST_RESONANCES AXIS=1,-1 OUTPUT=raw_data NAME=a FREQ_START={min_freq} FREQ_END={max_freq} HZ_PER_SEC={hz_per_sec}
M400
RESPOND MSG="Belts comparative frequency profile generation..."
RESPOND MSG="This may take some time (3-5min)"
RUN_SHELL_COMMAND CMD=shaketune PARAMS=BELTS
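The two TEST_RESONANCES calls above use the diagonal directions (1,1) and (1,-1) because, on CoreXY kinematics, each diagonal is driven by a single belt. A tiny sketch of the standard CoreXY relation (a = x + y, b = x - y; belt naming conventions vary between firmwares and may not match the NAME= labels used here):

```python
# Sketch of why a diagonal "half-axis" move isolates one belt on CoreXY,
# assuming the usual kinematic mapping a = x + y, b = x - y.
def belt_displacements(dx, dy):
    return dx + dy, dx - dy   # (belt_a, belt_b)

print(belt_displacements(1, 1))    # (2, 0): only one belt moves
print(belt_displacements(1, -1))   # (0, 2): only the other belt moves
```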


@@ -0,0 +1,24 @@
################################################
###### STANDARD INPUT_SHAPER CALIBRATIONS ######
################################################
# Written by Frix_x#0161 #
[gcode_macro EXCITATE_AXIS_AT_FREQ]
description: Maintain a specified excitation frequency for a period of time to diagnose and locate a source of vibration
gcode:
{% set frequency = params.FREQUENCY|default(25)|int %}
{% set time = params.TIME|default(10)|int %}
{% set axis = params.AXIS|default("x")|string|lower %}
{% if axis not in ["x", "y", "a", "b"] %}
{ action_raise_error("AXIS selection invalid. Should be either x, y, a or b!") }
{% endif %}
{% if axis == "a" %}
{% set axis = "1,-1" %}
{% elif axis == "b" %}
{% set axis = "1,1" %}
{% endif %}
TEST_RESONANCES OUTPUT=raw_data AXIS={axis} FREQ_START={frequency-1} FREQ_END={frequency+1} HZ_PER_SEC={1/(time/3)}
M400
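The HZ_PER_SEC value above is what turns a resonance sweep into a near-constant excitation: sweeping the 2 Hz band around FREQUENCY at 3/TIME Hz per second keeps the toolhead shaking close to the target for about two thirds of TIME seconds. A quick sketch of that parameter math (the helper name is illustrative only):

```python
def excitation_sweep(frequency=25, time=10):
    # Mirrors the macro: FREQ_START={frequency-1} FREQ_END={frequency+1}
    # HZ_PER_SEC={1/(time/3)}
    freq_start, freq_end = frequency - 1, frequency + 1
    hz_per_sec = 1 / (time / 3)                       # = 3 / time
    duration = (freq_end - freq_start) / hz_per_sec   # 2 Hz band -> 2 * time / 3 seconds
    return freq_start, freq_end, hz_per_sec, duration

start, end, rate, duration = excitation_sweep()       # defaults: 24 Hz -> 26 Hz at 0.3 Hz/s
```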


@@ -2,59 +2,29 @@
 ###### VIBRATIONS AND SPEED OPTIMIZATIONS ######
 ################################################
 # Written by Frix_x#0161 #
-# @version: 2.1
-# CHANGELOG:
-#   v2.1: allow decimal entries for speed and increment and added the E axis as an option to be measured
-#   v2.0: added the possibility to measure multiple axes
-#   v1.0: first speed and vibrations optimization macro
-### What is it? ###
-# This macro helps you identify the speed settings that exacerbate the vibrations of the machine (i.e. where the frame resonates badly).
-# It also helps to find the clean speed ranges where the machine is silent.
-# I had some strong vibrations at very specific speeds on my machine (52mm/s for example) and I wanted to find all these problematic speeds
-# to avoid them in my slicer profile and finally get the silent machine I was dreaming of!
-# It works by moving the toolhead at different speed settings while recording the vibrations using the ADXL chip. Then the macro calls a custom script
-# to compute and find the best speed settings. The results can be found in your config folder using the Fluidd/Mainsail file manager.
-# The goal is to make it easy to set up, share, and use.
-# This macro is parametric and most of the values can be adjusted with their respective input parameters.
-# It can be called without any parameters - in which case the default values would be used - or with any combination of parameters as desired.
-# Usage:
-#   1. DO YOUR INPUT SHAPER CALIBRATION FIRST! This macro should not be used before, as it would be useless and the results invalid.
-#   2. Call the VIBRATIONS_CALIBRATION macro with the speed range you want to measure (default 20 to 200mm/s with a 2mm/s increment).
-#      Be careful with the Z_HEIGHT variable that defaults to 20mm -> if your ADXL is under the nozzle, increase it to avoid crashing the ADXL into the bed of the machine.
-#   3. Wait for it to finish all the measurements and compute the graph. Then look at it in the results folder.
 [gcode_macro VIBRATIONS_CALIBRATION]
 gcode:
-    #
-    # PARAMETERS
-    #
     {% set size = params.SIZE|default(60)|int %} # size of the area where the movements are done
     {% set direction = params.DIRECTION|default('XY') %} # can be set to either XY, AB, ABXY, A, B, X, Y, Z
     {% set z_height = params.Z_HEIGHT|default(20)|int %} # z height to put the toolhead before starting the movements
-    {% set verbose = params.VERBOSE|default(true) %} # whether to log the current speed in the console
     {% set min_speed = params.MIN_SPEED|default(20)|float * 60 %} # minimum feedrate for the movements
     {% set max_speed = params.MAX_SPEED|default(200)|float * 60 %} # maximum feedrate for the movements
     {% set speed_increment = params.SPEED_INCREMENT|default(2)|float * 60 %} # feedrate increment between each move
     {% set feedrate_travel = params.TRAVEL_SPEED|default(200)|int * 60 %} # travel feedrate between moves
+    {% set accel = params.ACCEL|default(3000)|int %} # accel value used to move on the pattern
     {% set accel_chip = params.ACCEL_CHIP|default("adxl345") %} # ADXL chip name in the config
-    #
-    # COMPUTED VALUES
-    #
     {% set mid_x = printer.toolhead.axis_maximum.x|float / 2 %}
     {% set mid_y = printer.toolhead.axis_maximum.y|float / 2 %}
     {% set nb_samples = ((max_speed - min_speed) / speed_increment + 1) | int %}
+    {% set accel = [accel, printer.configfile.settings.printer.max_accel]|min %}
+    {% set old_accel = printer.toolhead.max_accel %}
+    {% set old_accel_to_decel = printer.toolhead.max_accel_to_decel %}
+    {% set old_sqv = printer.toolhead.square_corner_velocity %}
     {% set direction_factor = {
         'XY' : {
             'start' : {'x': -0.5, 'y': -0.5 },
@@ -129,9 +99,7 @@ gcode:
     }
     %}
-    #
-    # STARTING...
-    #
     {% if not 'xyz' in printer.toolhead.homed_axes %}
         { action_raise_error("Must Home printer first!") }
     {% endif %}
@@ -155,19 +123,19 @@ gcode:
     SAVE_GCODE_STATE NAME=STATE_VIBRATIONS_CALIBRATION
+    M83
     G90
+    # Set the wanted acceleration values (not too high to avoid oscillation, not too low to be able to reach constant speed on each segment)
+    SET_VELOCITY_LIMIT ACCEL={accel} ACCEL_TO_DECEL={accel} SQUARE_CORNER_VELOCITY={[(accel / 1000), 5.0]|max}
     # Going to the start position
-    G1 Z{z_height}
+    G1 Z{z_height} F{feedrate_travel / 10}
     G1 X{mid_x + (size * direction_factor[direction].start.x) } Y{mid_y + (size * direction_factor[direction].start.y)} F{feedrate_travel}
     # vibration pattern for each frequency
     {% for curr_sample in range(0, nb_samples) %}
         {% set curr_speed = min_speed + curr_sample * speed_increment %}
-        {% if verbose %}
-            RESPOND MSG="{"Current speed: %.2f mm/s" % (curr_speed / 60)|float}"
-        {% endif %}
+        RESPOND MSG="{"Current speed: %.2f mm/s" % (curr_speed / 60)|float}"
         ACCELEROMETER_MEASURE CHIP={accel_chip}
         {% if direction == 'E' %}
@@ -178,14 +146,16 @@ gcode:
         {% endfor %}
     {% endif %}
     ACCELEROMETER_MEASURE CHIP={accel_chip} NAME=sp{("%.2f" % (curr_speed / 60)|float)|replace('.','_')}n1
     G4 P300
     M400
     {% endfor %}
-    {% if verbose %}
-        RESPOND MSG="Graphs generation... Please wait a minute or two and look in the configured folder."
-    {% endif %}
-    RUN_SHELL_COMMAND CMD=plot_graph PARAMS="VIBRATIONS {direction}"
+    RESPOND MSG="Machine and motors vibration graph generation..."
+    RESPOND MSG="This may take some time (3-5min)"
+    RUN_SHELL_COMMAND CMD=shaketune PARAMS="VIBRATIONS {direction}"
+    # Restore the previous acceleration values
+    SET_VELOCITY_LIMIT ACCEL={old_accel} ACCEL_TO_DECEL={old_accel_to_decel} SQUARE_CORNER_VELOCITY={old_sqv}
     RESTORE_GCODE_STATE NAME=STATE_VIBRATIONS_CALIBRATION
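The sweep above takes nb_samples measurements, derived directly from the requested speed range and increment. A sketch of that computation (in plain mm/s; the macro multiplies everything by 60 to work in mm/min, and the function name is illustrative):

```python
def sampled_speeds(min_speed=20.0, max_speed=200.0, increment=2.0):
    # Mirrors: {% set nb_samples = ((max_speed - min_speed) / speed_increment + 1) | int %}
    nb_samples = int((max_speed - min_speed) / increment + 1)
    return [min_speed + i * increment for i in range(nb_samples)]

speeds = sampled_speeds()
print(len(speeds), speeds[0], speeds[-1])   # 91 20.0 200.0
```

With the defaults this is 91 measurement points, which is why the graph generation afterwards can take a few minutes.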


@@ -0,0 +1,171 @@
#!/usr/bin/env python3
######################################
###### AXE_MAP DETECTION SCRIPT ######
######################################
# Written by Frix_x#0161 #
# Be sure to make this script executable using SSH: type 'chmod +x ./analyze_axesmap.py' when in the folder!
#####################################################################
################ !!! DO NOT EDIT BELOW THIS LINE !!! ################
#####################################################################
import optparse
import numpy as np
import locale
from scipy.signal import butter, filtfilt
NUM_POINTS = 500
# Set the best locale for time and date formatting (generation of the titles)
try:
locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
except locale.Error:
locale.setlocale(locale.LC_TIME, 'C')
# Override the built-in print function to avoid problems in Klipper due to locale settings
original_print = print
def print_with_c_locale(*args, **kwargs):
original_locale = locale.setlocale(locale.LC_ALL, None)
locale.setlocale(locale.LC_ALL, 'C')
original_print(*args, **kwargs)
locale.setlocale(locale.LC_ALL, original_locale)
print = print_with_c_locale
######################################################################
# Computation
######################################################################
def accel_signal_filter(data, cutoff=2, fs=100, order=5):
nyq = 0.5 * fs
normal_cutoff = cutoff / nyq
b, a = butter(order, normal_cutoff, btype='low', analog=False)
filtered_data = filtfilt(b, a, data)
filtered_data -= np.mean(filtered_data)
return filtered_data
def find_first_spike(data):
min_index, max_index = np.argmin(data), np.argmax(data)
return ('-', min_index) if min_index < max_index else ('', max_index)
def get_movement_vector(data, start_idx, end_idx):
if start_idx < end_idx:
vector = []
for i in range(3):
vector.append(np.mean(data[i][start_idx:end_idx], axis=0))
return vector
else:
return np.zeros(3)
def angle_between(v1, v2):
v1_u = v1 / np.linalg.norm(v1)
v2_u = v2 / np.linalg.norm(v2)
return np.arccos(np.clip(np.dot(v1_u, v2_u), -1.0, 1.0))
def compute_errors(filtered_data, spikes_sorted, accel_value, num_points):
# Get the movement start points in the correct order from the sorted bag of spikes
movement_starts = [spike[0][1] for spike in spikes_sorted]
# Theoretical unit vectors for X, Y, Z printer axes
printer_axes = {
'x': np.array([1, 0, 0]),
'y': np.array([0, 1, 0]),
'z': np.array([0, 0, 1])
}
alignment_errors = {}
sensitivity_errors = {}
for i, axis in enumerate(['x', 'y', 'z']):
movement_start = movement_starts[i]
movement_end = movement_start + num_points
movement_vector = get_movement_vector(filtered_data, movement_start, movement_end)
alignment_errors[axis] = angle_between(movement_vector, printer_axes[axis])
measured_accel_magnitude = np.linalg.norm(movement_vector)
if accel_value != 0:
sensitivity_errors[axis] = abs(measured_accel_magnitude - accel_value) / accel_value * 100
else:
sensitivity_errors[axis] = None
return alignment_errors, sensitivity_errors
######################################################################
# Startup and main routines
######################################################################
def parse_log(logname):
with open(logname) as f:
for header in f:
if not header.startswith('#'):
break
if not header.startswith('freq,psd_x,psd_y,psd_z,psd_xyz'):
# Raw accelerometer data
return np.loadtxt(logname, comments='#', delimiter=',')
# Power spectral density data or shaper calibration data
raise ValueError("File %s does not contain raw accelerometer data and therefore "
"is not supported by this script. Please use the official Klipper "
"calibrate_shaper.py script to process it instead." % (logname,))
def axesmap_calibration(lognames, accel=None):
# Parse the raw data and get them ready for analysis
raw_datas = [parse_log(filename) for filename in lognames]
if len(raw_datas) > 1:
raise ValueError("Analysis of multiple CSV files at once is not possible with this script")
filtered_data = [accel_signal_filter(raw_datas[0][:, i+1]) for i in range(3)]
spikes = [find_first_spike(filtered_data[i]) for i in range(3)]
spikes_sorted = sorted([(spikes[0], 'x'), (spikes[1], 'y'), (spikes[2], 'z')], key=lambda x: x[0][1])
# Using the previous variables to get the axes_map and errors
axes_map = ','.join([f"{spike[0][0]}{spike[1]}" for spike in spikes_sorted])
# alignment_error, sensitivity_error = compute_errors(filtered_data, spikes_sorted, accel, NUM_POINTS)
results = f"Detected axes_map:\n {axes_map}\n"
# TODO: work on this function that is currently not giving good results...
# results += "Accelerometer angle deviation:\n"
# for axis, angle in alignment_error.items():
# angle_degrees = np.degrees(angle) # Convert radians to degrees
# results += f" {axis.upper()} axis: {angle_degrees:.2f} degrees\n"
# results += "Accelerometer sensitivity error:\n"
# for axis, error in sensitivity_error.items():
# results += f" {axis.upper()} axis: {error:.2f}%\n"
return results
def main():
# Parse command-line arguments
usage = "%prog [options] <raw logs>"
opts = optparse.OptionParser(usage)
opts.add_option("-o", "--output", type="string", dest="output",
default=None, help="filename of output graph")
opts.add_option("-a", "--accel", type="string", dest="accel",
default=None, help="acceleration value used to do the movements")
options, args = opts.parse_args()
if len(args) < 1:
opts.error("No CSV file(s) to analyse")
if options.accel is None:
opts.error("You must specify the acceleration value used when generating the CSV file (option -a)")
try:
accel_value = float(options.accel)
except ValueError:
opts.error("Invalid acceleration value. It should be a numeric value.")
results = axesmap_calibration(args, accel_value)
print(results)
if options.output is not None:
with open(options.output, 'w') as f:
f.write(results)
if __name__ == '__main__':
main()
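analyze_axesmap.py derives the axes_map string purely from the order in which each accelerometer channel spikes and from the polarity of that first spike. A self-contained sketch of this logic on synthetic traces (the spike positions and amplitudes are arbitrary stand-ins for real filtered CSV data):

```python
import numpy as np

def find_first_spike(data):
    # Same logic as the script: a leading minimum means a negative-going spike
    min_index, max_index = np.argmin(data), np.argmax(data)
    return ('-', min_index) if min_index < max_index else ('', max_index)

n = 100
x = np.zeros(n); x[10] = 1.0;  x[12] = -0.5   # positive spike first (X move)
y = np.zeros(n); y[40] = -1.0; y[42] = 0.5    # negative spike first (inverted Y channel)
z = np.zeros(n); z[70] = 1.0;  z[72] = -0.5   # positive spike last (Z move)

spikes = [find_first_spike(trace) for trace in (x, y, z)]
spikes_sorted = sorted(zip(spikes, 'xyz'), key=lambda s: s[0][1])
axes_map = ','.join(f"{sign}{name}" for (sign, _idx), name in spikes_sorted)
print(axes_map)   # x,-y,z
```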


@@ -4,12 +4,6 @@
######## CoreXY BELTS CALIBRATION SCRIPT ######## ######## CoreXY BELTS CALIBRATION SCRIPT ########
################################################# #################################################
# Written by Frix_x#0161 # # Written by Frix_x#0161 #
# @version: 2.0
# CHANGELOG:
# v2.0: updated the script to align it to the new K-Shake&Tune module
# v1.0: first version of this tool for enhanced vizualisation of belt graphs
# Be sure to make this script executable using SSH: type 'chmod +x ./graph_belts.py' when in the folder! # Be sure to make this script executable using SSH: type 'chmod +x ./graph_belts.py' when in the folder!
@@ -18,9 +12,9 @@
##################################################################### #####################################################################
import optparse, matplotlib, sys, importlib, os import optparse, matplotlib, sys, importlib, os
from textwrap import wrap
from collections import namedtuple from collections import namedtuple
import numpy as np import numpy as np
import scipy
import matplotlib.pyplot, matplotlib.dates, matplotlib.font_manager import matplotlib.pyplot, matplotlib.dates, matplotlib.font_manager
import matplotlib.ticker, matplotlib.gridspec, matplotlib.colors import matplotlib.ticker, matplotlib.gridspec, matplotlib.colors
import matplotlib.patches import matplotlib.patches
@@ -28,10 +22,6 @@ import locale
from datetime import datetime from datetime import datetime
matplotlib.use('Agg') matplotlib.use('Agg')
try:
locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
except locale.Error:
locale.setlocale(locale.LC_TIME, 'C')
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ" # For paired peaks names ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ" # For paired peaks names
@@ -54,6 +44,22 @@ KLIPPAIN_COLORS = {
 }
+
+# Set the best locale for time and date formatting (generation of the titles)
+try:
+    locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
+except locale.Error:
+    locale.setlocale(locale.LC_TIME, 'C')
+
+# Override the built-in print function to avoid problems in Klipper due to locale settings
+original_print = print
+def print_with_c_locale(*args, **kwargs):
+    original_locale = locale.setlocale(locale.LC_ALL, None)
+    locale.setlocale(locale.LC_ALL, 'C')
+    original_print(*args, **kwargs)
+    locale.setlocale(locale.LC_ALL, original_locale)
+print = print_with_c_locale

 ######################################################################
 # Computation of the PSD graph
 ######################################################################
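The locale juggling added above can be exercised on its own; a minimal standalone sketch of the same wrapper (the usage at the end is illustrative and not part of the commit):

```python
import io
import locale
import contextlib

# Same idea as in the commit: print under the C locale so Klipper's console
# never sees locale-dependent number formatting, then restore whatever
# locale was active before the call.
original_print = print

def print_with_c_locale(*args, **kwargs):
    original_locale = locale.setlocale(locale.LC_ALL, None)  # query, don't change
    locale.setlocale(locale.LC_ALL, 'C')
    original_print(*args, **kwargs)
    locale.setlocale(locale.LC_ALL, original_locale)

# Illustrative usage: output is identical to print(), but locale-stable
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    print_with_c_locale("similarity: %.1f%%" % 87.5)
```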
@@ -159,144 +165,61 @@ def pair_peaks(peaks1, freqs1, psd1, peaks2, freqs2, psd2):
 def compute_spectrogram(data):
     N = data.shape[0]
-    Fs = N / (data[-1,0] - data[0,0])
+    Fs = N / (data[-1, 0] - data[0, 0])
     # Round up to a power of 2 for faster FFT
     M = 1 << int(.5 * Fs - 1).bit_length()
     window = np.kaiser(M, 6.)
-    def _specgram(x):
-        return matplotlib.mlab.specgram(
-                x, Fs=Fs, NFFT=M, noverlap=M//2, window=window,
-                mode='psd', detrend='mean', scale_by_freq=False)
-    d = {'x': data[:,1], 'y': data[:,2], 'z': data[:,3]}
-    pdata, bins, t = _specgram(d['x'])
-    for ax in 'yz':
-        pdata += _specgram(d[ax])[0]
-    return pdata, bins, t
+    def _specgram(x):
+        x_detrended = x - np.mean(x) # Detrending by subtracting the mean value
+        return scipy.signal.spectrogram(
+                x_detrended, fs=Fs, window=window, nperseg=M, noverlap=M//2,
+                detrend='constant', scaling='density', mode='psd')
+    d = {'x': data[:, 1], 'y': data[:, 2], 'z': data[:, 3]}
+    f, t, pdata = _specgram(d['x'])
+    for axis in 'yz':
+        pdata += _specgram(d[axis])[2]
+    return pdata, t, f
 ######################################################################
 # Computation of the differential spectrogram
 ######################################################################

-# Performs a standard bilinear interpolation for a given x, y point based on surrounding input grid values. This function
-# is part of the logic to re-align both belts spectrogram in order to combine them in the differential spectrogram.
-def bilinear_interpolate(x, y, points, values):
-    x1, x2 = points[0]
-    y1, y2 = points[1]
-    f11, f12 = values[0]
-    f21, f22 = values[1]
-    interpolated_value = (
-        (f11 * (x2 - x) * (y2 - y) +
-        f21 * (x - x1) * (y2 - y) +
-        f12 * (x2 - x) * (y - y1) +
-        f22 * (x - x1) * (y - y1)) / ((x2 - x1) * (y2 - y1))
-    )
-    return interpolated_value

-# Interpolate source_data (2D) to match target_x and target_y in order to interpolate and
+# Interpolate source_data (2D) to match target_x and target_y in order to
 # get similar time and frequency dimensions for the differential spectrogram
 def interpolate_2d(target_x, target_y, source_x, source_y, source_data):
-    interpolated_data = np.zeros((len(target_y), len(target_x)))
-    for i, y in enumerate(target_y):
-        for j, x in enumerate(target_x):
-            # Find indices of surrounding points in source data
-            # and ensure we don't exceed array bounds
-            x_indices = np.searchsorted(source_x, x) - 1
-            y_indices = np.searchsorted(source_y, y) - 1
-            x_indices = max(0, min(len(source_x) - 1, x_indices))
-            y_indices = max(0, min(len(source_y) - 1, y_indices))
-            if x_indices == len(source_x) - 2:
-                x_indices -= 1
-            if y_indices == len(source_y) - 2:
-                y_indices -= 1
-            x1, x2 = source_x[x_indices], source_x[x_indices + 1]
-            y1, y2 = source_y[y_indices], source_y[y_indices + 1]
-            f11 = source_data[y_indices, x_indices]
-            f12 = source_data[y_indices, x_indices + 1]
-            f21 = source_data[y_indices + 1, x_indices]
-            f22 = source_data[y_indices + 1, x_indices + 1]
-            interpolated_data[i, j] = bilinear_interpolate(x, y, ((x1, x2), (y1, y2)), ((f11, f12), (f21, f22)))
+    # Create a grid of points in the source and target space
+    source_points = np.array([(x, y) for y in source_y for x in source_x])
+    target_points = np.array([(x, y) for y in target_y for x in target_x])
+
+    # Flatten the source data to match the flattened source points
+    source_values = source_data.flatten()
+
+    # Interpolate and reshape the interpolated data to match the target grid shape and replace NaN with zeros
+    interpolated_data = scipy.interpolate.griddata(source_points, source_values, target_points, method='nearest')
+    interpolated_data = interpolated_data.reshape((len(target_y), len(target_x)))
+    interpolated_data = np.nan_to_num(interpolated_data)
     return interpolated_data
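The rewritten `interpolate_2d()` can be exercised on a toy grid; this sketch reproduces the function from the diff and shows the nearest-neighbour behaviour when a 2x2 source grid is resampled to 3x3 (the grid values are illustrative):

```python
import numpy as np
import scipy.interpolate

# Same function as added in this commit
def interpolate_2d(target_x, target_y, source_x, source_y, source_data):
    source_points = np.array([(x, y) for y in source_y for x in source_x])
    target_points = np.array([(x, y) for y in target_y for x in target_x])
    source_values = source_data.flatten()
    interpolated = scipy.interpolate.griddata(
        source_points, source_values, target_points, method='nearest')
    return np.nan_to_num(interpolated.reshape((len(target_y), len(target_x))))

source_x = source_y = np.array([0.0, 1.0])
source_data = np.array([[0.0, 1.0],      # shape (len(source_y), len(source_x))
                        [2.0, 3.0]])
target_x = target_y = np.array([0.0, 0.4, 1.0])
result = interpolate_2d(target_x, target_y, source_x, source_y, source_data)
# Each target point snaps to the value of its nearest source point
```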
-# This function identifies a "ridge" of high gradient magnitude in a spectrogram (pdata) - ie. a resonance diagonal line. Starting from
-# the maximum value in the first column, it iteratively follows the direction of the highest gradient in the vicinity (window configured using
-# the n_average parameter). The result is a sequence of indices that traces the resonance line across the original spectrogram.
-def detect_ridge(pdata, n_average=3):
-    grad_y, grad_x = np.gradient(pdata)
-    magnitude = np.sqrt(grad_x**2 + grad_y**2)
-
-    # Start at the maximum value in the first column
-    start_idx = np.argmax(pdata[:, 0])
-    path = [start_idx]
-
-    # Walk through the spectrogram following the path of the ridge
-    for j in range(1, pdata.shape[1]):
-        # Look in the vicinity of the previous point
-        vicinity = magnitude[max(0, path[-1]-n_average):min(pdata.shape[0], path[-1]+n_average+1), j]
-        # Take an average of top few points
-        sorted_indices = np.argsort(vicinity)
-        top_indices = sorted_indices[-n_average:]
-        next_idx = int(np.mean(top_indices) + max(0, path[-1]-n_average))
-        path.append(next_idx)
-    return np.array(path)
-
-# This function calculates the time offset between two resonances lines (ridge1 and ridge2) using cross-correlation in
-# the frequency domain (using FFT). The result provides the lag (or offset) at which the two sequences are most similar.
-# This is used to re-align both belts spectrograms on their resonances lines in order to create the combined spectrogram.
-def compute_cross_correlation_offset(ridge1, ridge2):
-    # Ensure that the two arrays have the same shape
-    if len(ridge1) < len(ridge2):
-        ridge1 = np.pad(ridge1, (0, len(ridge2) - len(ridge1)))
-    elif len(ridge1) > len(ridge2):
-        ridge2 = np.pad(ridge2, (0, len(ridge1) - len(ridge2)))
-    cross_corr = np.fft.fftshift(np.fft.ifft(np.fft.fft(ridge1) * np.conj(np.fft.fft(ridge2))))
-    return np.argmax(np.abs(cross_corr)) - len(ridge1) // 2
-
-# This function shifts data along its second dimension - ie. time here - by a specified shift_amount
-def shift_data_in_time(data, shift_amount):
-    if shift_amount > 0:
-        return np.pad(data, ((0, 0), (shift_amount, 0)), mode='constant')[:, :-shift_amount]
-    elif shift_amount < 0:
-        return np.pad(data, ((0, 0), (0, -shift_amount)), mode='constant')[:, -shift_amount:]
-    else:
-        return data
-
-# Main logic function to combine two similar spectrogram - ie. from both belts paths - by detecting similarities (ridges), computing
-# the time lag and realigning them. Finally this function combine (by substracting signals) the aligned spectrograms in a new one.
-# This result of a mostly zero-ed new spectrogram with some colored zones highlighting differences in the belts paths.
+# Main logic function to combine two similar spectrograms - ie. from both belts paths - by subtracting signals in order to create
+# a new composite spectrogram. This results in a divergent but mostly centered new spectrogram (center will be white) with some colored zones
+# highlighting differences in the belts paths. The summative spectrogram is used for the MHI calculation.
 def combined_spectrogram(data1, data2):
     pdata1, bins1, t1 = compute_spectrogram(data1)
-    pdata2, _, _ = compute_spectrogram(data2)
-
-    # Detect ridges
-    ridge1 = detect_ridge(pdata1)
-    ridge2 = detect_ridge(pdata2)
-
-    # Compute offset using cross-correlation and shift/align and interpolate the spectrograms
-    offset = compute_cross_correlation_offset(ridge1, ridge2)
-    pdata2_aligned = shift_data_in_time(pdata2, offset)
-    pdata2_interpolated = interpolate_2d(t1, bins1, t1, bins1, pdata2_aligned)
-
-    # Combine the spectrograms
-    combined_data = np.abs(pdata1 - pdata2_interpolated)
-    return combined_data, bins1, t1
+    pdata2, bins2, t2 = compute_spectrogram(data2)
+
+    # Interpolate the spectrograms
+    pdata2_interpolated = interpolate_2d(bins1, t1, bins2, t2, pdata2)
+
+    # Combine them in two forms: a summed diff for the MHI computation and a diverging diff for the spectrogram colors
+    combined_sum = np.abs(pdata1 - pdata2_interpolated)
+    combined_divergent = pdata1 - pdata2_interpolated
+
+    return combined_sum, combined_divergent, bins1, t1
 # Compute a composite and highly subjective value indicating the "mechanical health of the printer (0 to 100%)" that represent the
@@ -305,7 +228,8 @@ def combined_spectrogram(data1, data2):
 # This result in a percentage value quantifying the machine behavior around the main resonances that give an hint if only touching belt tension
 # will give good graphs or if there is a chance of mechanical issues in the background (above 50% should be considered as probably problematic)
 def compute_mhi(combined_data, similarity_coefficient, num_unpaired_peaks):
-    filtered_data = combined_data[combined_data > 100]
+    # filtered_data = combined_data[combined_data > 100]
+    filtered_data = np.abs(combined_data)

     # First compute a "total variability metric" based on the sum of the gradient that sum the magnitude of will emphasize regions of the
     # spectrogram where there are rapid changes in magnitude (like the edges of resonance peaks).
@@ -330,15 +254,15 @@ def compute_mhi(combined_data, similarity_coefficient, num_unpaired_peaks):
 def mhi_lut(mhi):
     if 0 <= mhi <= 30:
         return "Excellent mechanical health"
-    elif 31 <= mhi <= 45:
+    elif 30 < mhi <= 45:
         return "Good mechanical health"
-    elif 46 <= mhi <= 55:
+    elif 45 < mhi <= 55:
         return "Acceptable mechanical health"
-    elif 56 <= mhi <= 70:
+    elif 55 < mhi <= 70:
         return "Potential signs of a mechanical issue"
-    elif 71 <= mhi <= 85:
+    elif 70 < mhi <= 85:
         return "Likely a mechanical issue"
-    elif 86 <= mhi <= 100:
+    elif 85 < mhi <= 100:
         return "Mechanical issue detected"
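The boundary change matters because the MHI is a float: with the old closed integer ranges, a value such as 30.5 matched no branch and the lookup silently returned None. The new half-open ranges cover the whole 0-100 scale. The same logic can also be expressed as a threshold table (an illustrative restructuring, not the code in the commit):

```python
# Upper bound of each band, checked in ascending order (illustrative rewrite)
MHI_BANDS = [
    (30, "Excellent mechanical health"),
    (45, "Good mechanical health"),
    (55, "Acceptable mechanical health"),
    (70, "Potential signs of a mechanical issue"),
    (85, "Likely a mechanical issue"),
    (100, "Mechanical issue detected"),
]

def mhi_lut(mhi):
    for upper, text in MHI_BANDS:
        if mhi <= upper:
            return text
    return MHI_BANDS[-1][1]   # clamp anything above 100
```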
@@ -451,23 +375,26 @@ def plot_compare_frequency(ax, lognames, signal1, signal2, max_freq):
 def plot_difference_spectrogram(ax, data1, data2, signal1, signal2, similarity_factor, max_freq):
-    combined_data, bins, t = combined_spectrogram(data1, data2)
+    combined_sum, combined_divergent, bins, t = combined_spectrogram(data1, data2)

     # Compute the MHI value from the differential spectrogram sum of gradient, salted with
     # the similarity factor and the number or unpaired peaks from the belts frequency profile
     # Be careful, this value is highly opinionated and is pretty experimental!
-    mhi, textual_mhi = compute_mhi(combined_data, similarity_factor, len(signal1.unpaired_peaks) + len(signal2.unpaired_peaks))
+    mhi, textual_mhi = compute_mhi(combined_sum, similarity_factor, len(signal1.unpaired_peaks) + len(signal2.unpaired_peaks))
     print(f"[experimental] Mechanical Health Indicator: {textual_mhi.lower()} ({mhi:.1f}%)")

     ax.set_title(f"Differential Spectrogram", fontsize=14, color=KLIPPAIN_COLORS['dark_orange'], weight='bold')
     ax.plot([], [], ' ', label=f'{textual_mhi} (experimental)')

-    # Draw the differential spectrogram with a specific norm to get light grey zero values and red for max values (vmin to vcenter is not used)
-    norm = matplotlib.colors.TwoSlopeNorm(vcenter=np.min(combined_data), vmax=np.max(combined_data))
-    ax.pcolormesh(bins, t, combined_data.T, cmap='RdBu_r', norm=norm, shading='gouraud')
+    # Draw the differential spectrogram with a specific custom norm to get orange or purple values where there is signal or white near zeros
+    colors = [KLIPPAIN_COLORS['dark_orange'], KLIPPAIN_COLORS['orange'], 'white', KLIPPAIN_COLORS['purple'], KLIPPAIN_COLORS['dark_purple']]
+    cm = matplotlib.colors.LinearSegmentedColormap.from_list('klippain_divergent', list(zip([0, 0.25, 0.5, 0.75, 1], colors)))
+    norm = matplotlib.colors.TwoSlopeNorm(vmin=np.min(combined_divergent), vcenter=0, vmax=np.max(combined_divergent))
+    ax.pcolormesh(t, bins, combined_divergent.T, cmap=cm, norm=norm, shading='gouraud')
     ax.set_xlabel('Frequency (hz)')
     ax.set_xlim([0., max_freq])
     ax.set_ylabel('Time (s)')
-    ax.set_ylim([0, t[-1]])
+    ax.set_ylim([0, bins[-1]])

     fontP = matplotlib.font_manager.FontProperties()
     fontP.set_size('medium')
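The new colour mapping pins zero to the white centre stop even when the data range is asymmetric, which is what `TwoSlopeNorm` provides on top of the custom five-stop colormap. A standalone sketch (the hex stops below are placeholder assumptions, except the `#70088C` purple that appears elsewhere in this diff):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')
import matplotlib.colors

# Five-stop diverging colormap, white at the centre; hex values are placeholders
colors = ['#CC5500', '#FF8D32', 'white', '#70088C', '#330044']
cm = matplotlib.colors.LinearSegmentedColormap.from_list(
    'klippain_divergent', list(zip([0, 0.25, 0.5, 0.75, 1], colors)))

# Asymmetric diverging data: TwoSlopeNorm still maps 0 exactly to the centre
data = np.array([-3.0, 0.0, 9.0])
norm = matplotlib.colors.TwoSlopeNorm(vmin=data.min(), vcenter=0, vmax=data.max())
mapped = norm(data)        # -3 -> 0.0, 0 -> 0.5, 9 -> 1.0
centre_rgba = cm(0.5)      # near the white stop
```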
@@ -476,16 +403,16 @@ def plot_difference_spectrogram(ax, data1, data2, signal1, signal2, similarity_f
     # Plot vertical lines for unpaired peaks
     unpaired_peak_count = 0
     for _, peak in enumerate(signal1.unpaired_peaks):
-        ax.axvline(signal1.freqs[peak], color='red', linestyle='dotted', linewidth=1.5)
+        ax.axvline(signal1.freqs[peak], color=KLIPPAIN_COLORS['red_pink'], linestyle='dotted', linewidth=1.5)
         ax.annotate(f"Peak {unpaired_peak_count + 1}", (signal1.freqs[peak], t[-1]*0.05),
-                    textcoords="data", color='red', rotation=90, fontsize=10,
+                    textcoords="data", color=KLIPPAIN_COLORS['red_pink'], rotation=90, fontsize=10,
                     verticalalignment='bottom', horizontalalignment='right')
         unpaired_peak_count +=1

     for _, peak in enumerate(signal2.unpaired_peaks):
-        ax.axvline(signal2.freqs[peak], color='red', linestyle='dotted', linewidth=1.5)
+        ax.axvline(signal2.freqs[peak], color=KLIPPAIN_COLORS['red_pink'], linestyle='dotted', linewidth=1.5)
         ax.annotate(f"Peak {unpaired_peak_count + 1}", (signal2.freqs[peak], t[-1]*0.05),
-                    textcoords="data", color='red', rotation=90, fontsize=10,
+                    textcoords="data", color=KLIPPAIN_COLORS['red_pink'], rotation=90, fontsize=10,
                     verticalalignment='bottom', horizontalalignment='right')
         unpaired_peak_count +=1
@@ -494,11 +421,11 @@ def plot_difference_spectrogram(ax, data1, data2, signal1, signal2, similarity_f
         label = ALPHABET[idx]
         x_min = min(peak1[1], peak2[1])
         x_max = max(peak1[1], peak2[1])
-        ax.axvline(x_min, color=KLIPPAIN_COLORS['purple'], linestyle='dotted', linewidth=1.5)
-        ax.axvline(x_max, color=KLIPPAIN_COLORS['purple'], linestyle='dotted', linewidth=1.5)
-        ax.fill_between([x_min, x_max], 0, np.max(combined_data), color=KLIPPAIN_COLORS['purple'], alpha=0.3)
+        ax.axvline(x_min, color=KLIPPAIN_COLORS['dark_purple'], linestyle='dotted', linewidth=1.5)
+        ax.axvline(x_max, color=KLIPPAIN_COLORS['dark_purple'], linestyle='dotted', linewidth=1.5)
+        ax.fill_between([x_min, x_max], 0, np.max(combined_divergent), color=KLIPPAIN_COLORS['dark_purple'], alpha=0.3)
         ax.annotate(f"Peaks {label}", (x_min, t[-1]*0.05),
-                    textcoords="data", color=KLIPPAIN_COLORS['purple'], rotation=90, fontsize=10,
+                    textcoords="data", color=KLIPPAIN_COLORS['dark_purple'], rotation=90, fontsize=10,
                     verticalalignment='bottom', horizontalalignment='right')

     return

--- graph_shaper.py
+++ graph_shaper.py

@@ -6,19 +6,7 @@
 # Derived from the calibrate_shaper.py official Klipper script
 # Copyright (C) 2020 Dmitry Butyugin <dmbutyugin@google.com>
 # Copyright (C) 2020 Kevin O'Connor <kevin@koconnor.net>
-#
 # Written by Frix_x#0161 #
-# @version: 2.0
-# CHANGELOG:
-# v2.0: updated the script to align it to the new K-Shake&Tune module
-# v1.1: - improved the damping ratio computation with linear approximation for more precision
-#       - reworked the top graph to add more information to it with colored zones,
-#         automated peak detection, etc...
-#       - added a full spectrogram of the signal on the bottom to allow deeper analysis
-# v1.0: first version of this script inspired from the official Klipper
-#       shaper calibration script to add an automatic damping ratio estimation to it
 # Be sure to make this script executable using SSH: type 'chmod +x ./graph_shaper.py' when in the folder!
@@ -29,21 +17,19 @@
 import optparse, matplotlib, sys, importlib, os, math
 from textwrap import wrap
 import numpy as np
+import scipy
 import matplotlib.pyplot, matplotlib.dates, matplotlib.font_manager
 import matplotlib.ticker, matplotlib.gridspec
 import locale
 from datetime import datetime

 matplotlib.use('Agg')
-try:
-    locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
-except locale.Error:
-    locale.setlocale(locale.LC_TIME, 'C')

 PEAKS_DETECTION_THRESHOLD = 0.05
 PEAKS_EFFECT_THRESHOLD = 0.12
 SPECTROGRAM_LOW_PERCENTILE_FILTER = 5
+MAX_SMOOTHING = 0.1
 KLIPPAIN_COLORS = {
     "purple": "#70088C",
@@ -52,6 +38,22 @@ KLIPPAIN_COLORS = {
 }
+
+# Set the best locale for time and date formatting (generation of the titles)
+try:
+    locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
+except locale.Error:
+    locale.setlocale(locale.LC_TIME, 'C')
+
+# Override the built-in print function to avoid problems in Klipper due to locale settings
+original_print = print
+def print_with_c_locale(*args, **kwargs):
+    original_locale = locale.setlocale(locale.LC_ALL, None)
+    locale.setlocale(locale.LC_ALL, 'C')
+    original_print(*args, **kwargs)
+    locale.setlocale(locale.LC_ALL, original_locale)
+print = print_with_c_locale
 ######################################################################
 # Computation
 ######################################################################

@@ -73,7 +75,7 @@ def calibrate_shaper_with_damping(datas, max_smoothing):
     fr, zeta = compute_damping_ratio(psd, freqs)

     print("Recommended shaper is %s @ %.1f Hz" % (shaper.name, shaper.freq))
-    print("Axis has a resonant frequency ω0=%.1fHz with an estimated damping ratio ζ=%.3f" % (fr, zeta))
+    print("Axis has a main resonant frequency at %.1fHz with an estimated damping ratio of %.3f" % (fr, zeta))

     return shaper.name, all_shapers, calibration_data, fr, zeta
@@ -98,20 +100,22 @@ def compute_damping_ratio(psd, freqs):
 def compute_spectrogram(data):
     N = data.shape[0]
-    Fs = N / (data[-1,0] - data[0,0])
+    Fs = N / (data[-1, 0] - data[0, 0])
     # Round up to a power of 2 for faster FFT
     M = 1 << int(.5 * Fs - 1).bit_length()
     window = np.kaiser(M, 6.)
-    def _specgram(x):
-        return matplotlib.mlab.specgram(
-                x, Fs=Fs, NFFT=M, noverlap=M//2, window=window,
-                mode='psd', detrend='mean', scale_by_freq=False)
-    d = {'x': data[:,1], 'y': data[:,2], 'z': data[:,3]}
-    pdata, bins, t = _specgram(d['x'])
-    for ax in 'yz':
-        pdata += _specgram(d[ax])[0]
-    return pdata, bins, t
+    def _specgram(x):
+        x_detrended = x - np.mean(x) # Detrending by subtracting the mean value
+        return scipy.signal.spectrogram(
+                x_detrended, fs=Fs, window=window, nperseg=M, noverlap=M//2,
+                detrend='constant', scaling='density', mode='psd')
+    d = {'x': data[:, 1], 'y': data[:, 2], 'z': data[:, 3]}
+    f, t, pdata = _specgram(d['x'])
+    for axis in 'yz':
+        pdata += _specgram(d[axis])[2]
+    return pdata, t, f
 # This find all the peaks in a curve by looking at when the derivative term goes from positive to negative
@@ -151,7 +155,7 @@ def detect_peaks(psd, freqs, window_size=5, vicinity=3):
 ######################################################################
 # Graphing
 ######################################################################

-def plot_freq_response_with_damping(ax, calibration_data, shapers, selected_shaper, fr, zeta, max_freq):
+def plot_freq_response_with_damping(ax, calibration_data, shapers, performance_shaper, fr, zeta, max_freq):
     freqs = calibration_data.freq_bins
     psd = calibration_data.psd_sum[freqs <= max_freq]
     px = calibration_data.psd_x[freqs <= max_freq]
@@ -181,30 +185,50 @@ def plot_freq_response_with_damping(ax, calibration_data, shapers, selected_shap
     ax2 = ax.twinx()
     ax2.yaxis.set_visible(False)

-    best_shaper_vals = None
-    no_vibr_shaper = None
-    no_vibr_shaper_freq = None
-    no_vibr_shaper_accel = 0
+    lowvib_shaper_vibrs = float('inf')
+    lowvib_shaper = None
+    lowvib_shaper_freq = None
+    lowvib_shaper_accel = 0

     # Draw the shappers curves and add their specific parameters in the legend
-    # This adds also a way to find the best shaper with 0% of vibrations (to be printed in the legend later)
+    # This also adds a way to find the best shaper with a low level of vibrations (with a reasonable level of smoothing)
     for shaper in shapers:
         shaper_max_accel = round(shaper.max_accel / 100.) * 100.
         label = "%s (%.1f Hz, vibr=%.1f%%, sm~=%.2f, accel<=%.f)" % (
                 shaper.name.upper(), shaper.freq,
                 shaper.vibrs * 100., shaper.smoothing,
                 shaper_max_accel)
-        linestyle = 'dotted'
-        if shaper.name == selected_shaper:
-            linestyle = 'dashdot'
-            selected_shaper_freq = shaper.freq
-            best_shaper_vals = shaper.vals
-        if (shaper.vibrs * 100 == 0.) and (shaper_max_accel > no_vibr_shaper_accel):
-            no_vibr_shaper_accel = shaper_max_accel
-            no_vibr_shaper = shaper.name
-            no_vibr_shaper_freq = shaper.freq
-        ax2.plot(freqs, shaper.vals, label=label, linestyle=linestyle)
-    ax.plot(freqs, psd * best_shaper_vals, label='With %s applied' % (selected_shaper.upper()), color='cyan')
+        ax2.plot(freqs, shaper.vals, label=label, linestyle='dotted')
+
+        # Get the performance shaper
+        if shaper.name == performance_shaper:
+            performance_shaper_freq = shaper.freq
+            performance_shaper_vibr = shaper.vibrs * 100.
+            performance_shaper_vals = shaper.vals
+
+        # Get the low vibration shaper
+        if (shaper.vibrs * 100 < lowvib_shaper_vibrs or (shaper.vibrs * 100 == lowvib_shaper_vibrs and shaper_max_accel > lowvib_shaper_accel)) and shaper.smoothing < MAX_SMOOTHING:
+            lowvib_shaper_accel = shaper_max_accel
+            lowvib_shaper = shaper.name
+            lowvib_shaper_freq = shaper.freq
+            lowvib_shaper_vibrs = shaper.vibrs * 100
+            lowvib_shaper_vals = shaper.vals
+
+    # User recommendations are added to the legend: one is Klipper's original suggestion that is usually good for performance
+    # and the other one is the custom "low vibration" recommendation that looks for a suitable shaper that doesn't have excessive
+    # smoothing and that has a lower vibration level. If both recommendations are the same shaper, or if no suitable "low
+    # vibration" shaper is found, then only a single line as the "best shaper" recommendation is added to the legend
+    if lowvib_shaper != None and lowvib_shaper != performance_shaper and lowvib_shaper_vibrs <= performance_shaper_vibr:
+        ax2.plot([], [], ' ', label="Recommended performance shaper: %s @ %.1f Hz" % (performance_shaper.upper(), performance_shaper_freq))
+        ax.plot(freqs, psd * performance_shaper_vals, label='With %s applied' % (performance_shaper.upper()), color='cyan')
+        ax2.plot([], [], ' ', label="Recommended low vibrations shaper: %s @ %.1f Hz" % (lowvib_shaper.upper(), lowvib_shaper_freq))
+        ax.plot(freqs, psd * lowvib_shaper_vals, label='With %s applied' % (lowvib_shaper.upper()), color='lime')
+    else:
+        ax2.plot([], [], ' ', label="Recommended best shaper: %s @ %.1f Hz" % (performance_shaper.upper(), performance_shaper_freq))
+        ax.plot(freqs, psd * performance_shaper_vals, label='With %s applied' % (performance_shaper.upper()), color='cyan')
+
+    # And the estimated damping ratio is finally added at the end of the legend
+    ax2.plot([], [], ' ', label="Estimated damping ratio (ζ): %.3f" % (zeta))
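The selection logic added above can be distilled into a small standalone function; `ShaperResult` and the sample numbers below are stand-ins for Klipper's shaper fitting results, and `MAX_SMOOTHING` mirrors the new constant from this diff:

```python
from collections import namedtuple

# Stand-in for a Klipper shaper fitting result (illustrative fields only)
ShaperResult = namedtuple('ShaperResult', ['name', 'freq', 'vibrs', 'smoothing', 'max_accel'])
MAX_SMOOTHING = 0.1

def pick_low_vibration_shaper(shapers):
    # Among shapers under the smoothing cap, prefer the lowest vibration
    # level and break ties with the highest usable acceleration.
    best = None
    for s in shapers:
        if s.smoothing >= MAX_SMOOTHING:
            continue
        accel = round(s.max_accel / 100.) * 100.
        if best is None or s.vibrs < best[1] or (s.vibrs == best[1] and accel > best[2]):
            best = (s.name, s.vibrs, accel)
    return best[0] if best else None

shapers = [
    ShaperResult('zv', 52.4, 0.12, 0.05, 8200),
    ShaperResult('mzv', 49.8, 0.00, 0.08, 6900),
    ShaperResult('ei', 61.0, 0.00, 0.09, 7400),       # 0% vibration, higher accel
    ShaperResult('2hump_ei', 70.2, 0.00, 0.15, 5200), # excluded: too much smoothing
]
```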
     # Draw the detected peaks and name them
     # This also draw the detection threshold and warning threshold (aka "effect zone")
@@ -228,10 +252,6 @@ def plot_freq_response_with_damping(ax, calibration_data, shapers, selected_shap
     ax.fill_between(freqs, 0, peaks_warning_threshold, color='green', alpha=0.15, label='Relax Region')
     ax.fill_between(freqs, peaks_warning_threshold, peaks_effect_threshold, color='orange', alpha=0.2, label='Warning Region')

-    # Final user recommendations added to the legend with an added 0% vibration shaper and the estimated damping ratio over stock Klipper's algorithms
-    ax2.plot([], [], ' ', label="Recommended shaper: %s @ %.1f Hz" % (selected_shaper.upper(), selected_shaper_freq))
-    ax2.plot([], [], ' ', label="Recommended low vibrations shaper: %s @ %.1f Hz" % (no_vibr_shaper.upper(), no_vibr_shaper_freq))
-    ax2.plot([], [], ' ', label="Estimated damping ratio (ζ): %.3f" % (zeta))

     # Add the main resonant frequency and damping ratio of the axis to the graph title
     ax.set_title("Axis Frequency Profile (ω0=%.1fHz, ζ=%.3f)" % (fr, zeta), fontsize=14, color=KLIPPAIN_COLORS['dark_orange'], weight='bold')
@@ -253,7 +273,7 @@ def plot_spectrogram(ax, data, peaks, max_freq):
     vmin_value = np.percentile(pdata, SPECTROGRAM_LOW_PERCENTILE_FILTER)

     ax.set_title("Time-Frequency Spectrogram", fontsize=14, color=KLIPPAIN_COLORS['dark_orange'], weight='bold')
-    ax.pcolormesh(bins, t, pdata.T, norm=matplotlib.colors.LogNorm(vmin=vmin_value),
+    ax.pcolormesh(t, bins, pdata.T, norm=matplotlib.colors.LogNorm(vmin=vmin_value),
                   cmap='inferno', shading='gouraud')

     # Add peaks lines in the spectrogram to get hint from peaks found in the first graph
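The argument swap here matters because `pcolormesh(X, Y, C)` expects `C` shaped `(len(Y), len(X))`: after the scipy rewrite the spectrogram's second and third return values swap meaning, so the first two mesh arguments are swapped to keep frequency on the x axis. A minimal shape check with synthetic sizes (illustrative only):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')
import matplotlib.pyplot as plt

freqs = np.linspace(0.0, 200.0, 50)             # x axis: frequency bins
times = np.linspace(0.0, 10.0, 30)              # y axis: segment times
pdata = np.random.rand(len(freqs), len(times))  # PSD shaped (freqs, times)

fig, ax = plt.subplots()
# pdata.T is (times, freqs) == (len(Y), len(X)): the corrected argument order
mesh = ax.pcolormesh(freqs, times, pdata.T, shading='gouraud')
plt.close(fig)
```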
@@ -303,7 +323,7 @@ def shaper_calibration(lognames, klipperdir="~/klipper", max_smoothing=None, max
     datas = [parse_log(fn) for fn in lognames]

     # Calibrate shaper and generate outputs
-    selected_shaper, shapers, calibration_data, fr, zeta = calibrate_shaper_with_damping(datas, max_smoothing)
+    performance_shaper, shapers, calibration_data, fr, zeta = calibrate_shaper_with_damping(datas, max_smoothing)

     fig = matplotlib.pyplot.figure()
     gs = matplotlib.gridspec.GridSpec(2, 1, height_ratios=[4, 3])
@@ -323,7 +343,7 @@ def shaper_calibration(lognames, klipperdir="~/klipper", max_smoothing=None, max
     fig.text(0.12, 0.957, title_line2, ha='left', va='top', fontsize=16, color=KLIPPAIN_COLORS['dark_purple'])

     # Plot the graphs
-    peaks = plot_freq_response_with_damping(ax1, calibration_data, shapers, selected_shaper, fr, zeta, max_freq)
+    peaks = plot_freq_response_with_damping(ax1, calibration_data, shapers, performance_shaper, fr, zeta, max_freq)
     plot_spectrogram(ax2, datas[0], peaks, max_freq)

     fig.set_size_inches(8.3, 11.6)

--- graph_vibrations.py
+++ graph_vibrations.py

@@ -4,15 +4,6 @@
 ###### SPEED AND VIBRATIONS PLOTTING SCRIPT ######
 ##################################################
 # Written by Frix_x#0161 #
-# @version: 2.0
-# CHANGELOG:
-# v2.0: - updated the script to align it to the new K-Shake&Tune module
-#       - new features for peaks detection and advised speed zones
-# v1.2: fixed a bug that could happen when username is not "pi" (thanks @spikeygg)
-# v1.1: better graph formatting
-# v1.0: first version of the script
 # Be sure to make this script executable using SSH: type 'chmod +x ./graph_vibrations.py' when in the folder !
@@ -29,10 +20,6 @@ import locale
 from datetime import datetime

 matplotlib.use('Agg')
-try:
-    locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
-except locale.Error:
-    locale.setlocale(locale.LC_TIME, 'C')

 PEAKS_DETECTION_THRESHOLD = 0.05
@@ -46,6 +33,22 @@ KLIPPAIN_COLORS = {
} }
# Set the best locale for time and date formating (generation of the titles)
try:
locale.setlocale(locale.LC_TIME, locale.getdefaultlocale())
except locale.Error:
locale.setlocale(locale.LC_TIME, 'C')
# Override the built-in print function to avoid problem in Klipper due to locale settings
original_print = print
def print_with_c_locale(*args, **kwargs):
original_locale = locale.setlocale(locale.LC_ALL, None)
locale.setlocale(locale.LC_ALL, 'C')
original_print(*args, **kwargs)
locale.setlocale(locale.LC_ALL, original_locale)
print = print_with_c_locale
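As an aside, the override added in this hunk restores the locale only if printing succeeds. The same technique can be generalized to any callable and guarded with `try/finally` so the previous locale always comes back; a small sketch (the `with_c_locale` helper is hypothetical, not part of the module):

```python
import locale

def with_c_locale(func, *args, **kwargs):
    """Run func under the 'C' locale, then restore the previous locale.

    Same idea as the print override above, but generalized to any callable
    and guarded with try/finally so the locale is restored even on error.
    """
    original = locale.setlocale(locale.LC_ALL)  # query only, don't change
    locale.setlocale(locale.LC_ALL, 'C')
    try:
        return func(*args, **kwargs)
    finally:
        locale.setlocale(locale.LC_ALL, original)
```

The `try/finally` is the important design choice: if the wrapped call raises, the process is not left stuck in the "C" locale.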
 ######################################################################
 # Computation
 ######################################################################
@@ -314,7 +317,7 @@ def parse_log(logname):
     return np.loadtxt(logname, comments='#', delimiter=',')
 # Power spectral density data or shaper calibration data
 raise ValueError("File %s does not contain raw accelerometer data and therefore "
-                 "is not supported by graph_vibrations.py script. Please use "
+                 "is not supported by this script. Please use the official Klipper "
                  "calibrate_shaper.py script to process it instead." % (logname,))
@@ -323,7 +326,7 @@ def extract_speed(logname):
     speed = re.search('sp(.+?)n', os.path.basename(logname)).group(1).replace('_','.')
 except AttributeError:
     raise ValueError("File %s does not contain speed in its name and therefore "
-                     "is not supported by graph_vibrations.py script." % (logname,))
+                     "is not supported by this script." % (logname,))
 return float(speed)


@@ -1,21 +1,9 @@
 #!/usr/bin/env python3
 ############################################
 ###### INPUT SHAPER KLIPPAIN WORKFLOW ######
 ############################################
 # Written by Frix_x#0161 #
-# @version: 2.0
-# CHANGELOG:
-# v2.0: new version of this as a Python script (to replace the old bash script) and implement the newer and improved shaper plotting scripts
-# v1.7: updated the handling of shaper files to account for the new analysis scripts as we are now using raw data directly
-# v1.6: - updated the handling of shaper graph files to be able to optionally account for added positions in the filenames and remove them
-#       - fixed a bug in the belt graph on slow SD card or Pi clones (Klipper was still writing in the file while we were already reading it)
-# v1.5: fixed Klipper unexpected fail at the end of the execution, even if graphs were correctly generated (unicode decode error fixed)
-# v1.4: added the ~/klipper dir parameter to the call of graph_vibrations.py for a better user handling (in case user is not "pi")
-# v1.3: some documentation improvement regarding the line endings that need to be LF for this file
-# v1.2: added the movement name to be transferred to the Python script in vibration calibration (to print it on the result graphs)
-# v1.1: multiple fixes and tweaks (mainly to avoid having empty files read by the python scripts after the mv command)
-# v1.0: first version of the script based on a Zellneralex script
 # Usage:
 # This script was designed to be used with gcode_shell_commands directly from Klipper
@@ -25,7 +13,6 @@
 # VIBRATIONS - To generate vibration diagram after calling the custom (Frix_x#0161) VIBRATIONS_CALIBRATION macro
 import os
 import time
 import glob
@@ -43,6 +30,7 @@ STORE_RESULTS = 3
 from graph_belts import belts_calibration
 from graph_shaper import shaper_calibration
 from graph_vibrations import vibrations_calibration
+from analyze_axesmap import axesmap_calibration
 RESULTS_SUBFOLDERS = ['belts', 'inputshaper', 'vibrations']
@@ -55,69 +43,104 @@ def is_file_open(filepath):
             if os.path.samefile(fd, filepath):
                 return True
         except FileNotFoundError:
+            # Klipper has already released the CSV file
+            pass
+        except PermissionError:
+            # Unable to check for this particular process due to permissions
             pass
     return False
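For reference, the technique behind `is_file_open()` can be sketched as a standalone function (a simplified, Linux-only illustration of the `/proc` scan; the exact loop structure of the real script is assumed, but the error handling mirrors the hunk above):

```python
import os

def is_file_open(filepath):
    """Return True if any process currently holds filepath open.

    Linux-only sketch: walk every /proc/<pid>/fd directory and compare
    each descriptor's target against filepath with os.path.samefile.
    """
    for pid in os.listdir('/proc'):
        if not pid.isdigit():
            continue  # skip non-process entries such as /proc/meminfo
        fd_dir = os.path.join('/proc', pid, 'fd')
        try:
            fds = os.listdir(fd_dir)
        except (FileNotFoundError, PermissionError):
            continue  # process exited mid-scan, or not ours to inspect
        for fd in fds:
            try:
                if os.path.samefile(os.path.join(fd_dir, fd), filepath):
                    return True
            except OSError:
                continue  # fd closed meanwhile, or points to a socket/pipe
    return False
```

This is why the script polls in a `while is_file_open(filename): time.sleep(2)` loop: it waits until no process (Klipper included) still holds the CSV open before moving it.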
-def get_belts_graph():
+def create_belts_graph():
     current_date = datetime.now().strftime('%Y%m%d_%H%M%S')
     lognames = []
-    for filename in glob.glob('/tmp/raw_data_axis*.csv'):
+    globbed_files = glob.glob('/tmp/raw_data_axis*.csv')
+    if not globbed_files:
+        print("No CSV files found in the /tmp folder to create the belt graphs!")
+        sys.exit(1)
+    if len(globbed_files) < 2:
+        print("Not enough CSV files found in the /tmp folder. Two files are required for the belt graphs!")
+        sys.exit(1)
+    sorted_files = sorted(globbed_files, key=os.path.getmtime, reverse=True)
+    for filename in sorted_files[:2]:
         # Wait for the file handler to be released by Klipper
         while is_file_open(filename):
-            time.sleep(3)
+            time.sleep(2)
         # Extract the tested belt from the filename and rename/move the CSV file to the result folder
         belt = os.path.basename(filename).split('_')[3].split('.')[0].upper()
         new_file = os.path.join(RESULTS_FOLDER, RESULTS_SUBFOLDERS[0], f'belt_{current_date}_{belt}.csv')
         shutil.move(filename, new_file)
+        os.sync() # Sync filesystem to avoid problems
         # Save the file path for later
         lognames.append(new_file)
+        # Wait for the file handler to be released by the move command
+        while is_file_open(new_file):
+            time.sleep(2)
     # Generate the belts graph and its name
     fig = belts_calibration(lognames, KLIPPER_FOLDER)
     png_filename = os.path.join(RESULTS_FOLDER, RESULTS_SUBFOLDERS[0], f'belts_{current_date}.png')
-    return fig, png_filename
+    fig.savefig(png_filename)
+    return
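The new selection logic (pick the two most recent measurements by modification time) can be isolated as a small sketch; the helper name `two_most_recent` is illustrative, not part of the module:

```python
import glob
import os

def two_most_recent(pattern):
    """Return the two newest files matching pattern, newest first.

    Sketch of the selection logic above: sort the glob matches by
    modification time in descending order and keep the first two.
    """
    files = sorted(glob.glob(pattern), key=os.path.getmtime, reverse=True)
    if len(files) < 2:
        raise SystemExit("Two CSV files are required for the belt graphs!")
    return files[:2]
```

Sorting by `os.path.getmtime` makes the workflow robust against stale leftovers in `/tmp`: older raw-data files from a previous run are simply ignored.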
-def get_shaper_graph():
+def create_shaper_graph():
     current_date = datetime.now().strftime('%Y%m%d_%H%M%S')
+    # Get all the files and sort them based on last modified time to select the most recent one
     globbed_files = glob.glob('/tmp/raw_data*.csv')
-    if len(globbed_files) > 1:
-        print("There is more than 1 measurement.csv found in the /tmp folder. Unable to plot the shaper graphs!")
-        print("Please clean the files in the /tmp folder and start again.")
+    if not globbed_files:
+        print("No CSV files found in the /tmp folder to create the input shaper graphs!")
         sys.exit(1)
-    filename = globbed_files[0]
+    sorted_files = sorted(globbed_files, key=os.path.getmtime, reverse=True)
+    filename = sorted_files[0]
     # Wait for the file handler to be released by Klipper
     while is_file_open(filename):
-        time.sleep(3)
+        time.sleep(2)
     # Extract the tested axis from the filename and rename/move the CSV file to the result folder
     axis = os.path.basename(filename).split('_')[3].split('.')[0].upper()
     new_file = os.path.join(RESULTS_FOLDER, RESULTS_SUBFOLDERS[1], f'resonances_{current_date}_{axis}.csv')
     shutil.move(filename, new_file)
+    os.sync() # Sync filesystem to avoid problems
+    # Wait for the file handler to be released by the move command
+    while is_file_open(new_file):
+        time.sleep(2)
     # Generate the shaper graph and its name
     fig = shaper_calibration([new_file], KLIPPER_FOLDER)
     png_filename = os.path.join(RESULTS_FOLDER, RESULTS_SUBFOLDERS[1], f'resonances_{current_date}_{axis}.png')
-    return fig, png_filename
+    fig.savefig(png_filename)
+    return
-def get_vibrations_graph(axis_name):
+def create_vibrations_graph(axis_name):
     current_date = datetime.now().strftime('%Y%m%d_%H%M%S')
     lognames = []
-    for filename in glob.glob('/tmp/adxl345-*.csv'):
+    globbed_files = glob.glob('/tmp/adxl345-*.csv')
+    if not globbed_files:
+        print("No CSV files found in the /tmp folder to create the vibration graphs!")
+        sys.exit(1)
+    if len(globbed_files) < 3:
+        print("Not enough CSV files found in the /tmp folder. At least 3 files are required for the vibration graphs!")
+        sys.exit(1)
+    for filename in globbed_files:
         # Wait for the file handler to be released by Klipper
         while is_file_open(filename):
-            time.sleep(3)
+            time.sleep(2)
         # Cleanup of the filename and moving it in the result folder
         cleanfilename = os.path.basename(filename).replace('adxl345', f'vibr_{current_date}')
@@ -129,6 +152,7 @@ def get_vibrations_graph(axis_name):
     # Sync filesystem to avoid problems as there are a lot of files copied
     os.sync()
+    time.sleep(5)
     # Generate the vibration graph and its name
     fig = vibrations_calibration(lognames, KLIPPER_FOLDER, axis_name)
@@ -140,7 +164,35 @@ def get_vibrations_graph(axis_name):
         tar.add(csv_file, recursive=False)
         os.remove(csv_file)
-    return fig, png_filename
+    fig.savefig(png_filename)
+    return
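The CSV archiving step visible in this hunk (`tar.add` followed by `os.remove`) boils down to a helper like the following. This is a hedged sketch: the function name and the gzip compression are assumptions, only the add-then-remove pattern comes from the diff above:

```python
import os
import tarfile

def archive_csvs(csv_files, tar_path):
    """Bundle measurement CSVs into one tar.gz, deleting the originals.

    Mirrors the archiving step above: each file is added to the archive
    (recursive=False, since these are plain files) and then removed so
    the results folder only keeps the compressed bundle.
    """
    with tarfile.open(tar_path, 'w:gz') as tar:
        for csv_file in csv_files:
            tar.add(csv_file, arcname=os.path.basename(csv_file), recursive=False)
            os.remove(csv_file)
```

Archiving matters here because a vibrations run produces one CSV per tested speed, so dozens of files would otherwise pile up next to each PNG.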
+def find_axesmap(accel):
+    current_date = datetime.now().strftime('%Y%m%d_%H%M%S')
+    result_filename = os.path.join(RESULTS_FOLDER, f'axes_map_{current_date}.txt')
+    lognames = []
+    globbed_files = glob.glob('/tmp/adxl345-*.csv')
+    if not globbed_files:
+        print("No CSV files found in the /tmp folder to analyze and find the axes_map!")
+        sys.exit(1)
+    sorted_files = sorted(globbed_files, key=os.path.getmtime, reverse=True)
+    filename = sorted_files[0]
+    # Wait for the file handler to be released by Klipper
+    while is_file_open(filename):
+        time.sleep(2)
+    # Analyze the CSV to find the axes_map parameter
+    lognames.append(filename)
+    results = axesmap_calibration(lognames, accel)
+    with open(result_filename, 'w') as f:
+        f.write(results)
+    return
 # Utility function to get old files based on their modification time
@@ -191,20 +243,21 @@ def main():
         os.makedirs(folder)
     if len(sys.argv) < 2:
-        print("Usage: plot_graphs.py [SHAPER|BELTS|VIBRATIONS]")
+        print("Usage: is_workflow.py [BELTS|SHAPER|VIBRATIONS|AXESMAP]")
         sys.exit(1)
     if sys.argv[1].lower() == 'belts':
-        fig, png_filename = get_belts_graph()
+        create_belts_graph()
     elif sys.argv[1].lower() == 'shaper':
-        fig, png_filename = get_shaper_graph()
+        create_shaper_graph()
     elif sys.argv[1].lower() == 'vibrations':
-        fig, png_filename = get_vibrations_graph(axis_name=sys.argv[2])
+        create_vibrations_graph(axis_name=sys.argv[2])
+    elif sys.argv[1].lower() == 'axesmap':
+        find_axesmap(accel=sys.argv[2])
     else:
-        print("Usage: plot_graphs.py [SHAPER|BELTS|VIBRATIONS]")
+        print("Usage: is_workflow.py [BELTS|SHAPER|VIBRATIONS|AXESMAP]")
         sys.exit(1)
-    fig.savefig(png_filename)
     clean_files()
     print(f"Graphs created. You will find the results in {RESULTS_FOLDER}")


@@ -0,0 +1,5 @@
+#!/usr/bin/env bash
+source ~/klippain_shaketune-env/bin/activate
+python ~/klippain_shaketune/K-ShakeTune/scripts/is_workflow.py "$@"
+deactivate


@@ -0,0 +1,6 @@
+[gcode_shell_command shaketune]
+command: ~/printer_data/config/K-ShakeTune/scripts/shaketune.sh
+timeout: 600.0
+verbose: True
+[respond]


@@ -11,24 +11,25 @@ It operates in two steps:
 2. Relocates the graphs and associated CSV files to your Klipper config folder for easy access via Mainsail/Fluidd to eliminate the need for SSH.
 3. Manages the folder by retaining only the most recent results (default setting of keeping the latest three sets).
-The [detailed documentation is here](./docs/README.md).
+Check out the **[detailed documentation of the Shake&Tune module here](./docs/README.md)**. You can also look at the documentation for each type of graph by directly clicking on them below to better understand your results and tune your machine!
-| Belts graphs | Axis graphs | Vibrations measurement |
+| [Belts graph](./docs/macros/belts_tuning.md) | [Axis input shaper graphs](./docs/macros/axis_tuning.md) | [Vibrations graph](./docs/macros/vibrations_tuning.md) |
 |:----------------:|:------------:|:---------------------:|
-| ![](./docs/images/belts_example.png) | ![](./docs/images/axis_example.png) | ![](./docs/images/vibrations_example.png) |
+| [<img src="./docs/images/belts_example.png">](./docs/macros/belts_tuning.md) | [<img src="./docs/images/axis_example.png">](./docs/macros/axis_tuning.md) | [<img src="./docs/images/vibrations_example.png">](./docs/macros/vibrations_tuning.md) |
 ## Installation
-For those not using the full [Klippain](https://github.com/Frix-x/klippain), follow these steps to integrate this Shake&Tune module in your setup:
+Follow these steps to install the Shake&Tune module in your printer:
-1. Run the install script over SSH on your printer:
+1. Be sure to have a working accelerometer on your machine. You can follow the official [Measuring Resonances Klipper documentation](https://www.klipper3d.org/Measuring_Resonances.html) to configure one. Validate with an `ACCELEROMETER_QUERY` command that everything works correctly.
+1. Then, you can install the Shake&Tune package by running over SSH on your printer:
    ```bash
    wget -O - https://raw.githubusercontent.com/Frix-x/klippain-shaketune/main/install.sh | bash
    ```
-2. Append the following to your `printer.cfg` file:
+1. Finally, append the following to your `printer.cfg` file and restart Klipper (if preferred, you can include only the needed macros; using `*.cfg` is a convenient way to include them all at once):
    ```
    [include K-ShakeTune/*.cfg]
    ```
-3. Optionally, if you want to get automatic updates, add the following to your `moonraker.cfg` file:
+1. Optionally, if you want to get automatic updates, add the following to your `moonraker.cfg` file:
    ```
    [update_manager Klippain-ShakeTune]
    type: git_repo
@@ -42,14 +43,15 @@ For those not using the full [Klippain](https://github.com/Frix-x/klippain), fol
 > **Note**:
 >
-> If already using my old IS workflow scripts, please remove everything before installing this new module. This includes the macros, the Python scripts, the `plot_graph.sh` and the `[gcode_shell_command plot_graph]` section.
+> If you are already using my old IS workflow scripts, please remove everything before installing this new module. This includes the macros, the Python scripts, the `plot_graph.sh` and the `[gcode_shell_command plot_graph]` section that are not needed anymore.
 ## Usage
 Ensure your machine is homed, then invoke one of the following macros as needed:
+- `AXES_MAP_CALIBRATION` to automatically find Klipper's `axes_map` parameter for your accelerometer orientation (be careful, this is experimental for now).
 - `BELTS_SHAPER_CALIBRATION` for belt resonance graphs, useful for verifying belt tension and differential belt paths behavior.
 - `AXES_SHAPER_CALIBRATION` for input shaper graphs to mitigate ringing/ghosting by tuning Klipper's input shaper system.
-- `VIBRATIONS_CALIBRATION` for machine vibration graphs to optimize your slicer speed profiles.
+- `VIBRATIONS_CALIBRATION` for machine and motor vibration graphs, used to optimize your slicer speed profiles and TMC driver parameters.
 - `EXCITATE_AXIS_AT_FREQ` to sustain a specific excitation frequency, useful to let you inspect and find out what is resonating.
-For further insights on the usage of the macros and the generated graphs, refer to the [K-Shake&Tune module documentation](./docs/README.md).
+For further insights on the usage of these macros and the generated graphs, refer to the [K-Shake&Tune module documentation](./docs/README.md).


@@ -1,14 +1,59 @@
 # Klippain Shake&Tune module documentation
-### Detailed documentation
-1. [Input Shaping and tuning generalities](./is_tuning_generalities.md)
-1. [Belt graphs](./macros/belts_tuning.md)
-1. [Axis Input Shaper graphs](./macros/axis_tuning.md)
-1. [Klippain vibrations graphs](./macros/vibrations_tuning.md)
-![](./banner.png)
-### Complementary ressources
+![](./banner_long.png)
+## Resonance testing
+First, check out the **[input shaping and tuning generalities](./is_tuning_generalities.md)** documentation to understand how it all works and what to look for when taking these measurements.
+Then look at the documentation for each type of graph by clicking on them below to run the tests and better understand your results to tune your machine!
+| [Belts graph](./macros/belts_tuning.md) | [Axis input shaper graphs](./macros/axis_tuning.md) | [Vibrations graph](./macros/vibrations_tuning.md) |
+|:----------------:|:------------:|:---------------------:|
+| [<img src="./images/belts_example.png">](./macros/belts_tuning.md) | [<img src="./images/axis_example.png">](./macros/axis_tuning.md) | [<img src="./images/vibrations_example.png">](./macros/vibrations_tuning.md) |
+## Additional macros
+### AXES_MAP_CALIBRATION
+All graphs generated by this package show plots based on accelerometer measurements, typically labeled with the X, Y, and Z axes. It's important to note that if the accelerometer is rotated, its axes may not align correctly with the machine axes, making the plots more difficult to interpret, analyze, and understand. The `AXES_MAP_CALIBRATION` macro is designed to automatically measure the alignment of the accelerometer in order to set it correctly.
+> **Note**:
+>
+> This misalignment doesn't affect the measurements because the total sum across all axes is used to set the input shaper filters. It's just an optional but convenient way to configure Klipper's `[adxl345]` (or whichever accelerometer you have) `axes_map` parameter.
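The underlying idea can be sketched in a few lines. This is a deliberately simplified illustration, not the actual `analyze_axesmap.py` algorithm: during a move along a single machine axis, the accelerometer axis with the largest integrated response, together with its sign, gives one entry of `axes_map`.

```python
import numpy as np

def dominant_axis(accel_xyz):
    """Infer one axes_map entry from samples taken during a pure +X
    (or +Y, +Z) machine move.

    accel_xyz: (N, 3) array of accelerometer samples. The axis with the
    largest accumulated response "wins", and its sign tells whether the
    sensor axis points with or against the machine axis.
    """
    totals = accel_xyz.sum(axis=0)            # integrate each sensor axis
    idx = int(np.argmax(np.abs(totals)))      # strongest responder
    sign = '-' if totals[idx] < 0 else ''
    return sign + 'xyz'[idx]
```

Running this once per machine axis (+X, +Y, +Z moves) yields the three comma-separated entries of the `axes_map` string.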
+Here are the parameters available when calling this macro:
+| parameters | default value | description |
+|-----------:|---------------|-------------|
+|Z_HEIGHT|20|Z height to put the toolhead at before starting the movements. Be careful: if your accelerometer is mounted under the nozzle, increase it to avoid crashing it into the bed of the machine|
+|SPEED|80|speed of the toolhead in mm/s for the movements|
+|ACCEL|1500 (or max printer accel)|accel in mm/s^2 used for all the moves|
+|TRAVEL_SPEED|120|speed in mm/s used for all the travel moves|
+|ACCEL_CHIP|"adxl345"|accelerometer chip name in the config|
+The machine will move slightly in +X, +Y, and +Z, and output in the console something like: `Detected axes_map: -z,y,x`.
+Use this value in your `printer.cfg` config file:
+```
+[adxl345] # replace "adxl345" with your correct accelerometer name
+axes_map: -z,y,x
+```
+### EXCITATE_AXIS_AT_FREQ
+The `EXCITATE_AXIS_AT_FREQ` macro is particularly useful for troubleshooting mechanical vibrations or resonance issues. This macro allows you to maintain a specific excitation frequency for a set duration, enabling hands-on diagnostics. By touching different components during the excitation, you can identify the source of the vibration, as contact usually stops it.
+Here are the parameters available when calling this macro:
+| parameters | default value | description |
+|-----------:|---------------|-------------|
+|FREQUENCY|25|excitation frequency (in Hz) that you want to maintain. Usually, it's the frequency of a peak on one of the graphs|
+|TIME|10|time in seconds to maintain this excitation|
+|AXIS|x|axis you want to excite. Can be set to either "x", "y", "a" or "b"|
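For example, using the parameters documented in the table above, hunting a resonance seen at 45 Hz on the Y axis of the graphs could look like (values here are illustrative):

```
EXCITATE_AXIS_AT_FREQ FREQUENCY=45 TIME=20 AXIS=y
```

While the excitation runs, gently touch panels, fans, and the frame one by one; the part whose contact stops or changes the noise is the likely resonator.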
+## Complementary resources
 - [Sineos post](https://klipper.discourse.group/t/interpreting-the-input-shaper-graphs/9879) in the Klipper knowledge base

BIN docs/banner_long.png — new file, 740 KiB (binary file not shown)
BIN — modified image, 204 KiB before → 247 KiB after (binary file not shown)


@@ -13,16 +13,16 @@ When a 3D printer moves, the motors apply some force to move the toolhead along
 ## Generalities on the graphs
 When tuning Input Shaper, keep the following in mind:
-1. **Focus on the shape of the graphs, not the exact numbers**. There could be differences between ADXL boards or even printers, so there is no specific "target" value. This means that you shouldn't expect to get the same graphs between different printers, even if they are similar in term of brand, parts, size and assembly.
-1. Small differences between consecutive test runs are normal, as ADXL quality and sensitivity is quite variable between boards.
+1. **Focus on the shape of the graphs, not the exact numbers**. There could be differences between accelerometer boards or even printers, so there is no specific "target" value. This means that you shouldn't expect to get the same graphs between different printers, even if they are similar in terms of brand, parts, size and assembly.
+1. Small differences between consecutive test runs are normal, as accelerometer quality and sensitivity are quite variable between boards.
 1. Perform the tests when the machine is heat-soaked and close to printing conditions, as the temperature will impact the machine components such as belt tension or even the frame that is known to expand a little bit.
 1. Avoid running the toolhead fans during the tests, as they introduce unnecessary noise to the graphs, making them harder to interpret. This means that even if you should heatsoak the printer, you should also refrain from activating the hotend heater during the test, as it will also trigger the hotend fan. However, as a bad fan usually introduces some vibrations, you can use the test to diagnose an unbalanced fan as seen in the [Examples of Input Shaper graphs](./macros/axis_tuning.md) section.
-1. Ensure the accuracy of your ADXL measurements by running a `MEASURE_AXES_NOISE` test and checking that the result is below 100 for all axes. If it's not, check your ADXL board and wiring before continuing.
+1. Ensure the accuracy of your accelerometer measurements by running a `MEASURE_AXES_NOISE` test and checking that the result is below 100 for all axes. If it's not, check your accelerometer board and wiring before continuing.
 1. The graphs can only show symptoms of possible problems and in different ways. Those symptoms can sometimes suggest causes, but they rarely pinpoint the exact issues. For example, while you may be able to diagnose that some screws are not tightened properly, you will unlikely find which exact screw is problematic using only these tests. You will almost always need to tinker and experiment.
 1. Finally, remember why you're running these tests: to get clean prints. Don't become too obsessive over perfect graphs, as the last bits of optimization will probably have the least impact on the printed parts in terms of ringing and ghosting.
-### Special note on accelerometer (ADXL) mounting point
+### Special note on accelerometer mounting point
 Input Shaping algorithms work by suppressing a single resonant frequency (or a range around a single resonant frequency). When setting the filter, **the primary goal is to target the resonant frequency of the toolhead and belts system** (see the [theory behind it](#theory-behind-it)), as this has the most significant impact on print quality and is the root cause of ringing.
 When setting up Input Shaper, it is important to consider the accelerometer mounting point. There are mainly two possibilities, each with its pros and cons:


@@ -11,7 +11,6 @@ Then, call the `AXES_SHAPER_CALIBRATION` macro and look for the graphs in the re
 | parameters | default value | description |
 |-----------:|---------------|-------------|
-|VERBOSE|1|Whether to log things in the console|
 |FREQ_START|5|Starting excitation frequency|
 |FREQ_END|133|Maximum excitation frequency|
 |HZ_PER_SEC|1|Number of Hz per second for the test|
@@ -40,10 +39,10 @@ For setting your Input Shaping filters, rely on the auto-computed values display
 * `2HUMP_EI` and `3HUMP_EI` are last-resort choices. Usually, they lead to a high level of smoothing in order to suppress the ringing while also using relatively low acceleration values. If they pop up as suggestions, it's likely your machine has underlying mechanical issues (that lead to pretty bad or "wide" graphs).
 - **Recommended Acceleration** (`accel<=...`): This isn't a standalone figure. It's essential to also consider the `vibr` and `sm` values as it's a compromise between the three. They will give you the percentage of remaining vibrations and the smoothing after Input Shaping, when using the recommended acceleration. Nothing will prevent you from using higher acceleration values; they are not a limit. However, when doing so, Input Shaping may not be able to suppress all the ringing on your parts. Finally, keep in mind that high acceleration values are not useful at all if there is still a high level of remaining vibrations: you should address any mechanical issues first.
 - **The remaining vibrations** (`vibr`): This directly correlates with ringing. It corresponds to the total value of the blue "after shaper" signal. Ideally, you want a filter with minimal or zero vibrations.
-- **Shaper recommendations**: This script will give you two recommendations. Pick the one that suits your needs:
-  * The first is Klipper's original suggestion, for best performance and acceleration on your machine while also allowing a little bit of remaining vibrations.
-  * The second aims for no remaining vibration to ensure the best print quality.
-- The final line provides the estimated damping ratio for the axis. This value is generated automatically and is only accurate if the graph displays a clear and well detached single peak.
+- **Shaper recommendations**: This script will give you some tailored recommendations based on your graphs. Pick the one that suits your needs:
+  * The "performance" shaper is Klipper's original suggestion that is good for high acceleration while also sometimes allowing a little bit of remaining vibrations. Use it if your goal is speed printing and you don't care much about some remaining ringing.
+  * The "low vibration" shaper aims for the lowest level of remaining vibration to ensure the best print quality with minimal ringing. This should be the best bet for most users.
+  * Sometimes, only a single recommendation called "best" shaper is presented. This means that either no suitable "low vibration" shaper was found (due to a high level of vibration or too much smoothing) or the "performance" shaper is also the one with the lowest vibration level.
 - **Damping Ratio**: Displayed at the end, this estimation is only reliable when the graph shows a distinct, standalone and clean peak. On a well-tuned machine, setting the damping ratio (instead of Klipper's 0.1 default value) can further reduce the ringing at high accelerations and with higher square corner velocities.
Then, add to your configuration: Then, add to your configuration:
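For example, a possible result in `printer.cfg` (the frequencies, shaper types and damping ratios below are placeholders; use the values printed on your own graphs, and note that the `damping_ratio_*` entries are optional):

```
[input_shaper]
shaper_freq_x: 53.4     # center frequency of the recommended X shaper
shaper_type_x: mzv      # recommended X shaper type
shaper_freq_y: 38.7
shaper_type_y: mzv
damping_ratio_x: 0.055  # optional: measured damping ratio (Klipper default is 0.1)
damping_ratio_y: 0.061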
@@ -112,7 +111,7 @@ Such graph patterns can arise from various factors, and there isn't a one-size-f
### Problematic CANBUS speed
Using CANBUS toolheads with an integrated accelerometer chip can sometimes pose challenges if the CANBUS speed is set too low. While users might lower the bus speed to fix Klipper's timing errors, this change also affects input shaping measurements. An example outcome of a low bus speed is the following graph that, though generally well shaped, appears jagged and spiky throughout. Additional low-frequency energy might also be present. For proper accelerometer operation on your CANBUS toolhead, a speed of 500k is the minimum, but 1M is advisable.
| CANBUS problem present | CANBUS problem solved |
| --- | --- |
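On a typical Klipper CAN setup, the host-side bitrate is defined in the can0 network interface. Here is a sketch assuming a Debian-style `/etc/network/interfaces.d/can0` file (the file location and syntax may differ on your system, and the same bitrate must also be flashed into the MCU firmware so both sides match):

```
allow-hotplug can0
iface can0 can static
    bitrate 1000000
    up ifconfig $IFACE txqueuelen 1024
```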


@@ -11,7 +11,6 @@ Then, call the `BELTS_SHAPER_CALIBRATION` macro and look for the graphs in the r
| parameters | default value | description |
|-----------:|---------------|-------------|
|VERBOSE|1|Whether to log things in the console|
|FREQ_START|5|Starting excitation frequency|
|FREQ_END|133|Maximum excitation frequency|
|HZ_PER_SEC|1|Number of Hz per second for the test|
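For example, to run the belt test over the default frequency range from the Klipper console (the parameter values shown are the defaults from the table above):

```
BELTS_SHAPER_CALIBRATION FREQ_START=5 FREQ_END=133 HZ_PER_SEC=1
```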


@@ -15,8 +15,8 @@ Call the `VIBRATIONS_CALIBRATION` macro with the direction and speed range you w
|-----------:|---------------|-------------|
|SIZE|60|size in mm of the area where the movements are done|
|DIRECTION|"XY"|direction vector where you want to do the measurements. Can be set to either "XY", "AB", "ABXY", "A", "B", "X", "Y", "Z", "E"|
|Z_HEIGHT|20|z height to put the toolhead at before starting the movements. Be careful: if your accelerometer is mounted under the nozzle, increase it to avoid crashing it on the bed of the machine|
|ACCEL|3000 (or max printer accel)|accel in mm/s^2 used for all the moves. Try to keep it relatively low to avoid bad oscillations that affect the measurements, but high enough to reach constant speed for >~70% of the segments|
|MIN_SPEED|20|minimum speed of the toolhead in mm/s for the movements|
|MAX_SPEED|200|maximum speed of the toolhead in mm/s for the movements|
|SPEED_INCREMENT|2|speed increment of the toolhead in mm/s between movements|
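For example, to measure XY vibrations from 20 to 200 mm/s from the Klipper console (values shown are the defaults from the table above):

```
VIBRATIONS_CALIBRATION DIRECTION="XY" Z_HEIGHT=20 MIN_SPEED=20 MAX_SPEED=200 SPEED_INCREMENT=2 ACCEL=3000
```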
@@ -32,7 +32,7 @@ Call the `VIBRATIONS_CALIBRATION` macro with the direction and speed range you w
These graphs essentially depict the behavior of the motor control on your machine. While there isn't much room for easy adjustments to enhance them, most of you should only use them to configure your slicer profile to avoid problematic speeds.
However, if you want to go down the rabbit hole, since the data in these graphs largely hinges on the type of motors, their physical characteristics and the way they are controlled by the TMC drivers' black magic, there are opportunities for optimization. Tweaking TMC parameters allows you to adjust the peaks, enhance machine performance, or diminish overall machine noise. For this process, I recommend directly using the [Klipper TMC Autotune](https://github.com/andrewmcgr/klipper_tmc_autotune) plugin, which should simplify everything considerably. But keep in mind that it's still an experimental plugin and it's not perfect.
For individuals inclined to reach the bottom of the rabbit hole and who want to handle this manually, the use of an oscilloscope is mandatory. The majority of the necessary resources are available directly on the Trinamic website:
1. You should first consult the datasheet specific to your TMC driver model for guidance on parameter names and their respective uses.
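If you go the Klipper TMC Autotune route, the plugin is configured per stepper in `printer.cfg`. A hypothetical fragment (the motor reference is an example only; it must match your actual motors, using a name from the plugin's motor database):

```
[autotune_tmc stepper_x]
motor: ldo-42sth48-2504ah   # example motor reference, replace with yours

[autotune_tmc stepper_y]
motor: ldo-42sth48-2504ah
```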


@@ -2,7 +2,9 @@
USER_CONFIG_PATH="${HOME}/printer_data/config"
KLIPPER_PATH="${HOME}/klipper"
K_SHAKETUNE_PATH="${HOME}/klippain_shaketune"
K_SHAKETUNE_VENV_PATH="${HOME}/klippain_shaketune-env"
set -eu
export LC_ALL=C
@@ -14,19 +16,17 @@ function preflight_checks {
exit -1
fi
if ! command -v python3 &> /dev/null; then
echo "[ERROR] Python 3 is not installed. Please install Python 3 to use the Shake&Tune module!"
exit -1
fi
if [ "$(sudo systemctl list-units --full -all -t service --no-legend | grep -F 'klipper.service')" ]; then
printf "[PRE-CHECK] Klipper service found! Continuing...\n\n"
else
echo "[ERROR] Klipper service not found, please install Klipper first!"
exit -1
fi
if [ -d "${HOME}/klippain_config" ]; then
if [ -f "${USER_CONFIG_PATH}/.VERSION" ]; then
echo "[ERROR] Klippain full installation found! Nothing is needed in order to use the K-Shake&Tune module!"
exit -1
fi
fi
}
function check_download {
@@ -48,9 +48,31 @@ function check_download {
fi
}
function setup_venv {
if [ ! -d "${K_SHAKETUNE_VENV_PATH}" ]; then
echo "[SETUP] Creating Python virtual environment..."
python3 -m venv "${K_SHAKETUNE_VENV_PATH}"
else
echo "[SETUP] Virtual environment already exists. Continuing..."
fi
source "${K_SHAKETUNE_VENV_PATH}/bin/activate"
echo "[SETUP] Installing/Updating K-Shake&Tune dependencies..."
pip install --upgrade pip
pip install -r "${K_SHAKETUNE_PATH}/requirements.txt"
deactivate
printf "\n"
}
function link_extension {
echo "[INSTALL] Linking scripts to your config directory..."
ln -frsn ${K_SHAKETUNE_PATH}/K-ShakeTune ${USER_CONFIG_PATH}/K-ShakeTune
if [ -d "${HOME}/klippain_config" ] && [ -f "${USER_CONFIG_PATH}/.VERSION" ]; then
echo "[INSTALL] Klippain full installation found! Linking module to the script folder of Klippain"
ln -frsn ${K_SHAKETUNE_PATH}/K-ShakeTune ${USER_CONFIG_PATH}/scripts/K-ShakeTune
else
ln -frsn ${K_SHAKETUNE_PATH}/K-ShakeTune ${USER_CONFIG_PATH}/K-ShakeTune
fi
}
function link_gcodeshellcommandpy {
@@ -76,6 +98,7 @@ printf "=============================================\n\n"
# Run steps
preflight_checks
check_download
setup_venv
link_extension
link_gcodeshellcommandpy
restart_klipper
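The `setup_venv` step added in this change follows a standard idempotent pattern: create the virtual environment only if it doesn't exist yet, then install the pinned dependencies into it. Here is a minimal standalone sketch of that pattern (the `/tmp` path is for demonstration only; the real script uses `${HOME}/klippain_shaketune-env`):

```shell
#!/bin/sh
# Idempotent venv creation, mirroring the setup_venv step (demo path only)
VENV="/tmp/shaketune-demo-env"
if [ ! -d "$VENV" ]; then
    echo "[SETUP] Creating Python virtual environment..."
    python3 -m venv "$VENV"
else
    echo "[SETUP] Virtual environment already exists. Continuing..."
fi
# The real installer then runs: pip install -r requirements.txt inside the venv
"$VENV/bin/python" --version
```

Because the directory check guards the creation, re-running the installer is safe and only updates the dependencies.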

requirements.txt Normal file

@@ -0,0 +1,12 @@
contourpy==1.2.0
cycler==0.12.1
fonttools==4.45.1
kiwisolver==1.4.5
matplotlib==3.8.2
numpy==1.26.2
packaging==23.2
Pillow==10.1.0
pyparsing==3.1.1
python-dateutil==2.8.2
scipy==1.11.4
six==1.16.0