AResCoN is a pipeline for accurate filtering of cells or nuclei after they have been inferred by a neural network. It takes the output of Cellpose or Stardist and uses Fiji to obtain measurements that can be used for cell filtering along the X and Y axes, and additionally along the Z axis across a small range of planes.
AResCoN is particularly useful in setups where complete imaging of a 2D slice (one hemisphere or both) is required, but it can also be used for isolated regions within a 2D slice.
-
A primary output of Cellpose (recommended) or Stardist (only if no Z-axis filtering is needed) is required. You can find notebooks for training/predictions on the Cellpose GitHub page. A notebook for predictions is also attached here.
-
The names of the obtained ROIs must follow the format of the ROIs provided in the repository, namely 001_001, 001_002, 001_003 and so on. Your ROIs will receive these names by default if you run the label (mask) to ROI conversion option in AResCoN.
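For reference, the expected naming pattern can be sketched in a couple of lines of Python. This is only an illustration; the helper name is hypothetical and not part of AResCoN:

```python
# Hypothetical helper illustrating the expected NNN_NNN ROI naming scheme:
# a zero-padded image index followed by a zero-padded ROI index.
def roi_name(image_idx, roi_idx):
    return f"{image_idx:03d}_{roi_idx:03d}"

# roi_name(1, 3) -> "001_003"
```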
-
At present, AResCoN only works on Windows. Kindly ignore the Linux option.
-
AResCoN works optimally with 16-bit images and has so far been tested only with 20x magnified captures. 8-bit images and lower or higher magnifications are likely to work too, as long as your dataset is homogeneous.
-
AResCoN has only been tested with .tif (and .tiff) images.
-
AResCoN receives only one channel as input at a time. If you have more than one channel, you should run each one separately.
-
For a complete utilization of its features, it requires images from small multi-stacks (usually comprising up to 10 planes). Due to (1) the wiggly nature of a mounted tissue, (2) the slightly different depths of the cells and nuclei populating it, and (3) inconsistencies in slice thickness, acquiring several planes secures at least one crisp capture of each cell/nucleus. AResCoN can filter out Cellpose predictions that are detected on separate planes but correspond to the same observation.
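The idea can be sketched with a toy example. The sharpness criterion below (standard deviation of pixel values) and the data layout are assumptions for illustration only; AResCoN's actual z-filtering algorithm is more elaborate:

```python
# Toy sketch of z-filtering: the same cell detected on several planes is
# reduced to its crispest capture. Standard deviation is used here as a
# crude sharpness proxy (an assumption for illustration, not AResCoN's
# actual criterion).
def keep_crispest(detections):
    # detections: dicts with hypothetical "plane" and "stddev" keys
    return max(detections, key=lambda d: d["stddev"])
```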
-
AResCoN has recently been fixed to accommodate multistack images comprising more than 9 planes. It can now also process multistack images comprising different numbers of planes (e.g. a multistack of 6 planes and a multistack of 12 planes).
-
Please use the Fiji/ImageJ version provided in this repository. Do not update Fiji! Other versions will probably not work unless you adjust the number of tab presses in the pyautogui calls in all modules where Fiji is invoked.
-
An installation of AutoHotKeysUX is necessary (https://www.autohotkey.com/ , v2 and not the deprecated one).
-
If your PC takes a long time to open Fiji, you will likely encounter errors during the initiation steps of Fiji (read the disclaimer below). Time how many seconds your Fiji needs to open and ensure that it opens in less than 13 seconds. If the default value of 13 seconds is not sufficient, you can raise this limit in the code yourself. Open all modules whose names start with Run (RunFijiMeasurements.py, RunReducedRois.py, RunMeanMeasurements.py, RunVisualTests.py), as well as the modules ConvertLabelsToRois.py and EntropyMeasurements.py, look for the command below and replace the value (or the variable corresponding to a value) in the time.sleep(). Your PC will then wait that many seconds after Fiji is launched, to ensure that Fiji is fully loaded (the next version of AResCoN will let you enter your own value in a designated field under the main tab).
subprocess.Popen([fiji_path])
time.sleep(13)

Install Anaconda or Miniconda. Open the Anaconda prompt and navigate to the directory where the AResCoN_dependencies.yaml file is. Then type:
conda env create -f AResCoN_dependencies.yaml

You should always use the latest version of the AResCoN code (see https://github.com/AngelosDid/AResCoN/releases). Only use the 1st release to download the appropriate Fiji version that AResCoN works with. All releases work with this Fiji version.
If you are using Cellpose, ensure that you use the attached Colab notebook, which you can find inside Arescon\Extra info you might need\Colab Notebook (this is part of the arescon zip file).
After you install the environment, locate the main.py file and run the code. The GUI of AResCoN should open!
🆕 You can also try using the beta executable version of AResCoN. Navigate to https://archive.org/details/arescon-executable , download the zip file, open the dist folder and run the main.exe. After a few seconds, the GUI of AResCoN should open! Kindly keep in mind that the executable version hasn't been thoroughly tested.
Use the 'Stack to Images' ImageJ function and save each plane separately. Image names must end in _plane1.tif, _plane2.tif and so on. Make sure that each plane includes only a single channel.
Alternatively, if your original images are multistack AND multichannel images, you can navigate to the fiji macros folder inside the AResCoN code and use SeparateMultiplaneForBatchMacro.txt. Copy its content, open Fiji and go to Process->Batch->Macro. Select as input the directory where all your multistack AND multichannel images are. When the first image opens, you will be asked to indicate a folder where each single-channel/single-plane image will be saved. This directory must be structured as directory/CN/planeN_images, where N is the number of your channel (for CN) or plane (for planeN). Therefore, if you have 3 channels and 3 planes, you need a C1, C2 and C3 folder, each containing plane1_images, plane2_images and plane3_images folders. All CN and planeN directories must be empty. (This macro will be incorporated into AResCoN in a future version.)
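The empty directory tree the macro expects can be created with a few lines of Python. This is a convenience sketch, not part of AResCoN, and the function name is hypothetical:

```python
from pathlib import Path

# Build the empty directory/CN/planeN_images tree expected by
# SeparateMultiplaneForBatchMacro.txt (here 3 channels and 3 planes).
def make_output_tree(root, n_channels=3, n_planes=3):
    for c in range(1, n_channels + 1):
        for p in range(1, n_planes + 1):
            (Path(root) / f"C{c}" / f"plane{p}_images").mkdir(parents=True, exist_ok=True)
```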
Create a main Images folder and then create as many planeN_images subdirectories as your maximum plane number. For instance, if brain a consists of 4 planes and brain b of 5, create 5 planeN subdirectories, where N runs from 1 to 5. plane1_images must contain a_plane1.tif and b_plane1.tif, plane2_images must contain a_plane2.tif and b_plane2.tif, and so on.
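Sorting the exported _planeN.tif files into their planeN_images subdirectories can also be scripted. Again, this is a hypothetical convenience sketch, not an AResCoN feature:

```python
import re
import shutil
from pathlib import Path

# Move files such as a_plane1.tif into dest_root/plane1_images/,
# a_plane2.tif into dest_root/plane2_images/, and so on.
def sort_planes(src_dir, dest_root):
    for tif in Path(src_dir).glob("*_plane*.tif*"):
        m = re.search(r"_plane(\d+)\.tiff?$", tif.name)
        if m:
            out = Path(dest_root) / f"plane{m.group(1)}_images"
            out.mkdir(parents=True, exist_ok=True)
            shutil.move(str(tif), str(out / tif.name))
```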
Alternatively, you can remove any hidden desktop.ini files from cmd:
cd "C:\Users\YourAccount\Desktop\ROIs folder"
del /s /a desktop.ini

Next: run Cellpose inferences and paste each .tif mask output into the respective planeN subdirectory of the main ROIs folder. For instance, if there are 2 multistacks of brain a and brain b, both comprising 5 planes, plane1_ROIs must contain a_plane1.tif and b_plane1.tif, plane2_ROIs must contain a_plane2.tif and b_plane2.tif, and so on.
The names of the .tif masks must be identical to the names of the initial _planeN.tif images. The provided Cellpose notebook appends an additional _predicted_mask to each mask output name (unless you remove it directly in the notebook). Remove this additional "_predicted_mask" part from all mask names. You can do this from cmd or, much more easily, with tools like PowerRename.
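If you prefer scripting over PowerRename, the rename can be done in a few lines of Python. This is a sketch and the function name is hypothetical:

```python
from pathlib import Path

# Strip the "_predicted_mask" part the Cellpose notebook appends, so that
# mask names match the original _planeN.tif image names again.
def strip_suffix(mask_dir, suffix="_predicted_mask"):
    for tif in Path(mask_dir).glob(f"*{suffix}.tif"):
        tif.rename(tif.with_name(tif.name.replace(suffix, "")))
```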
If you aren't using planes from small multistacks and are only interested in filtering single shots, place all your .tif output masks inside the plane1_Rois subdirectory of the main ROIs folder.
After you ensure that the architecture is correct, open the AResCoN code folder with VS Code and run main.py. A GUI window will open.
Click on the Microscopy tab and make sure that the number of zplanes corresponds to your maximum z-plane.
You can also adjust the bit depth now (16-bit images are recommended) and the system, which must be Windows.
Click on the Measure Rois tab, then click the Find fiji.exe button and locate the ImageJ.exe file (NOT fiji.exe) that is provided in the first release of this repository.
Then navigate to the ChangeROIs tab and click the 'Convert Labels to ROIs' button. You will be prompted to select the main ROIs folder that, at this stage, contains all the subdirectories comprising the mask .tif outputs.
Let Fiji run completely uninterrupted. It will open and close as many times as your maximum plane number.
At present, no messagebox appears to inform you that the process is complete. Wait a few seconds and a new folder named RoisFromLabels will be created inside your main ROIs folder, alongside your subdirectories.
If needed, remove hidden desktop.ini files from this folder too:

cd "C:\Users\YourAccount\Desktop\Images OR ROIs folder"
del /s /a desktop.ini

Create a main folder (e.g. Input) and paste the main Images folder that contains all planeN subdirectories inside it. Locate the main RoisFromLabels folder that includes all the newly created subdirectories, which now contain .zip files instead of .tif masks.
You can find the RoisFromLabels folder inside the subdirectories of the main ROIs folder. Paste the RoisFromLabels folder inside your main Input folder. Preferably, rename the RoisFromLabels folder to Rois_folder.
In the Fiji menu bar, type 'set measurements' and make sure that Area, Standard deviation, Min & max gray value, Limit to threshold and Display label are selected (I have personally opted for more, but you don't need them and they will cost time):
In AResCoN, navigate to Main Inputs and press the 'Images Folder' Button. Select your main Image Folder which contains all planeN_images subdirectories. Then press 'Rois Folder' button and select the main ROIs folder which contains all planeN_Rois subdirectories. Both of these main folders must be inside the Input directory.
Then navigate to the Measure Rois tab. Click the 'Save Folder' button and locate the Input directory (that is, the parent directory of the ROIs and Images folders). Create a third folder inside Input called 'save' (or any other name) and select it.
Fiji.exe is already selected, unless you have closed AResCoN before, in which case you have to select the ImageJ executable file again.
Read the instructions on the screen first and let Fiji run uninterrupted until it obtains all measurements. You will be notified once the procedure is completed. New csv measurements will have been created inside your planeN subdirectories in the save folder.
Go to the 'Add Metrics' tab and click Add measurements. You can also click Save measurements after the procedure is finished. In both cases, you will be notified with a messagebox. Depending on the size of your files, this might take time. A (Not Responding) message in the title bar of the program does not indicate that AResCoN is malfunctioning, only that it is still running.
Click on 'Find edges StdDev' and let Fiji run, like you did in step 6. You will be notified by a messagebox at the end, and a new folder containing the new FindEdges-based measurements will be created inside the Input folder.
Click on Add edges StdDev. This will add a column to each of your csv files inside the main folder. Then click 'Add measurements' again to load the new information into AResCoN.
Steps 7, 8 and 9 in one image:
Find the 2D filter tab and type a ROI enlargement factor (I like a value of 2). A factor of 2 means that the background field you take is the space of a bounding box twice the size of each cell's. Press the 'Locate .ahk' button and select autoenter.ahk inside the AResCoN code folder. After installing AutoHotKeysUX, press the 'Locate .exe' button and select the AutoHotKeysUX executable file.
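For intuition, enlarging a bounding box by a factor around its centre can be computed as below. This is a minimal geometric sketch, not AResCoN's actual implementation:

```python
# Enlarge an (x, y, width, height) bounding box by `factor` around its
# centre; factor=2 doubles each side (quadrupling the box area).
def enlarge_bbox(x, y, w, h, factor=2):
    cx, cy = x + w / 2, y + h / 2
    new_w, new_h = w * factor, h * factor
    return cx - new_w / 2, cy - new_h / 2, new_w, new_h
```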
Click on 'Get Roi-Background mean gray differences'. You will be asked to select an empty folder where Fiji will communicate with AResCoN during the process.
-
Eventually, images of convex hulls will remain in this folder. You can use them later as sanity checks. Each convex hull is a region created from the external boundaries of the most distant cells in all directions, yet still inside your slice (and not any contingent misdetected ROIs in the background of the slice). ROIs are 'burned' into each sanity-check image so that you can see whether the convex hull was created appropriately. Ignore erroneous ROIs that might lie outside the convex hull and derive from erroneous Cellpose detections, but be suspicious if many ROIs are outside the convex hull (which would mean that your slice was not successfully contrasted against the background of the image).
-
If your convex hulls are not created properly, you will have to tweak the code for threshold-based background detection inside the RunMeanMeasurements.py module. You can experiment with the different methods that Fiji offers: open the macro recorder to see the underlying code and replace the default code below. (Note that you don't have to reduce your particles to 1. As long as your slice is the largest particle after thresholding, the rest of the script will work fine.)
setOption("ScaleConversions", true);
run("Gaussian Blur...", "sigma=10");
run("Auto Threshold", "method=MinError(I) white");
Let Fiji run uninterrupted, as in step 6. This step will take longer than steps 6 and 8. How much longer really depends on the number of ROIs in each image: it might take a couple of minutes, or more, per plane for images with many thousands of ROIs.
Eventually, two more columns will be added to each csv file inside the subdirectories of the save folder, where your main measurements are stored. The most important new column is SurroundingMean, which is the relative background of each corresponding ROI. This background DOES NOT take into account pixel values of other ROIs falling inside the enlarged region. It also does not take into account the real black background behind the tissue, which is adjacent to neurons lying in the outermost cortical parts. Hence, this background is more reliable than other methods.
You can build your own ROI filters by inserting conditions based on the metrics/measurements included in your csv files. For instance, you can exclude all ROIs that are less than 30% brighter than their background by typing (Mean>SurroundingMean*1.3). You can add any other condition you want (always inside parentheses), as long as you don't repeat the name of a metric inside the same condition (you can still factorize if you want). If any of the conditions is violated, the ROI is filtered out.
💡 IMPORTANT: The ROIs that displayed a NaN value during the calculation of the relative background have artificially been given the value 0.000001 in the SurroundingMean column. These are mostly false ROIs detected outside the tissue, somewhere in the black background of the image. It is highly recommended to also add the (SurroundingMean>0.000001) condition to your filters, to ensure that no such ROIs are included in your final set.
There is a catch here, though: when masks were converted to ROIs (step 3), any contingent spatial gap between overlapping ROIs was lost. This means that if a neuron is completely surrounded by other neurons throughout the whole range of its enlarged form (see step 10), there will be no unmasked pixel values from which to calculate the SurroundingMean (unless the enlargement factor created a space that goes beyond the adjacent cells and includes at least one free pixel). This will erroneously lead to a 0.000001 value. This is naturally almost impossible; however, Cellpose makes some really false predictions from time to time which occupy a very large space in the image. The cell indicated by the red arrow in the example below would be 'trapped' and might erroneously acquire a NaN -> 0.000001 value.
🆕 The new Cellpose_inferences_notebook.ipynb (uploaded on 26 February 2026 or later) filters ROIs based on a minimum size (25 pixels), which you can change, as well as on an arbitrary maximum size, defined as 3 times the average size of all detected ROIs. This ensures the removal of strangely large ROIs. However, you should keep applying the (SurroundingMean>0.000001) condition to ensure that ROIs outside your tissue are rejected.
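In spirit, applying such filters works like the following pure-Python sketch. The function and the data layout are hypothetical (AResCoN parses the conditions you type into its GUI fields):

```python
# A ROI is accepted only if it satisfies every condition; otherwise it is
# rejected. Each condition is a predicate over one ROI's measurements.
def apply_conditions(rois, conditions):
    accepted, rejected = [], []
    for roi in rois:
        (accepted if all(c(roi) for c in conditions) else rejected).append(roi)
    return accepted, rejected

# e.g. the two conditions recommended above:
conditions = [
    lambda r: r["Mean"] > r["SurroundingMean"] * 1.3,  # (Mean>SurroundingMean*1.3)
    lambda r: r["SurroundingMean"] > 0.000001,         # (SurroundingMean>0.000001)
]
```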
Click on 'Apply filters' and you will be prompted to select an empty folder where two directories will be saved, namely accepted and rejected. The rejected folder comprises as many directories as your filtering conditions, so that you can see which condition was violated by a particular ROI.
Make sure you agree to create a .pkl file. You will later need this for Zfiltering.
Navigate to the Zfilter tab and press 'Apply filters'. Then select your newly created Accepted_Rois.pkl file. The default values should work well for a precise detection of the crispest version of a neuron (the plane of origin is indicated at the end of the ROI name, e.g. 001_132-4, where '-4' indicates that the selected crispest 'version' of the neuron originates from plane 4). The vast majority of cells or nuclei should be detected correctly.
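If you need the plane suffix for your own post-processing, it can be parsed back out of a ROI name with a small regex. This is a hypothetical helper, not part of AResCoN:

```python
import re

# Extract the plane of origin from a z-filtered ROI name such as "001_132-4".
def plane_of_origin(roi_name):
    m = re.search(r"-(\d+)$", roi_name)
    return int(m.group(1)) if m else None
```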
So far, I have detected a few negligible cases where some false positives survive the filtering, thereby leading to an erroneous 'double' detection of a neuron. You can easily investigate this yourself by overlaying the ROIs on the image. Although there will probably be no reason to tweak the filtering values, you can refer to decisiontree.pptx inside the AResCoN code to see which value corresponds to each step of the designed algorithm. I will provide a more detailed explanation of this in the future.
For each zip file created, you will also find a corresponding excel file that lists the accepted and rejected ROIs of the z-filtering. The node, as well as the condition under which the inclusion or exclusion took place, is also indicated. You can navigate to the project folder and find the decision trees for nuclear and cellular signal. Use the nodes of the nuclear decision tree to understand the exact condition that led to the exclusion or inclusion of a ROI. Of note, an additional recursive function for further removal of ROIs engulfing smaller ROIs is applied for nuclear signal.
The final zip file contains ROIs originating from all possible planes, keeping only the crispest version of each neuron, thereby allowing you to capture all neurons that could possibly be visible throughout your whole tissue!
https://youtu.be/fQPcCw8j4ls?t=5792
AResCoN was created solely by Angelos Didachos, a 4th-year PhD Candidate in Neuroscience. I am finishing my PhD in a few months and will then be actively looking for a job in Australia. Kindly cite my work via the GitHub page if you are planning to use AResCoN. If you are interested in the detection of animal freezing behaviors, check out my other repository, EasyFreezy, which uses Deeplabcut output to reliably detect freezing spans.









