Examples Generator #128
Conversation
Generating .ipynb notebooks
I think overall this could be useful to get examples of everything. We need to add some more tests. There are a few things I would like improved, though that isn't necessarily on you; I could look into them as well.
when the user can just do
I do also like the more detailed training examples that I have, and keeping them separate; but these are also helpful for the more obscure calls.
Test generation could also be automated, but that might be too much: at some point we are curating/managing the library manually, and we'd be flirting with the line of automating it all and losing that control.
I have it reading the resources file for the data_in/POST body of the responses, but I do not have it reading them as outputs. Instead I was planning on having the notebooks render the actual live output during generation and storing that raw output, but I was not sure if there was a way to store this generated output with the notebook before it is run. I tried. Just left the …
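For what it's worth, there does appear to be a way to bake live output into the notebook before committing it: execute the generated notebook programmatically and write the executed copy back. A minimal sketch using `nbformat` plus `nbconvert`'s `ExecutePreprocessor` (the path below is a placeholder, not necessarily where `generate_examples.py` writes):

```python
import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

# Placeholder path -- point this at whatever generate_examples.py emits.
path = "examples/timeseries.ipynb"

# Read the freshly generated, output-less notebook.
nb = nbformat.read(path, as_version=4)

# Execute every cell in order; outputs get attached to the cell objects.
ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
ep.preprocess(nb, {"metadata": {"path": "."}})

# Write the executed notebook back with its outputs included.
nbformat.write(nb, path)
```

`nbclient`'s `NotebookClient` would work equally well; the key point is that the outputs are attached to the cells before `nbformat.write` is called.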
I think this is due to the tests themselves writing the imports this way (here's one of the current tests). I have it pulling the actual import strings out of each file and putting them at the top, then prepending this import as well.
If we change the way things are imported in the tests, it will be fixed in the output examples. Perhaps we could do this when we go to do item #3?
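Purely as illustration, the import hoisting could look something like this `ast`-based pass (the helper name is hypothetical, not what `scripts/generate_examples.py` actually uses):

```python
import ast

def collect_imports(source: str) -> list[str]:
    """Pull the import statements out of a test file's source,
    preserving their original order and exact text."""
    tree = ast.parse(source)
    imports = []
    for node in tree.body:
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            # get_source_segment recovers the original statement text.
            imports.append(ast.get_source_segment(source, node))
    return imports
```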
You are right: the order of the items in the tests is the order of the example output. I don't have it preserving comments, but we could output those to the examples too if we wanted.
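If we did decide to carry comments over into the examples, a `tokenize` pass could collect them in source order; again, just a sketch with a hypothetical helper name:

```python
import io
import tokenize

def collect_comments(source: str) -> list[tuple[int, str]]:
    """Return (line_number, comment_text) pairs from a test file,
    in source order, so each can be emitted beside the right cell."""
    comments = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.COMMENT:
            comments.append((tok.start[0], tok.string))
    return comments
```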
More PRs that could come of this:
5+ ???? Let me know if you approve of any of these and I'll start on them, and if you need any other changes to this PR.



Generating examples using the `nbformat` library to write notebook files from the mock test files.

This PR adds:

- `scripts/generate_examples.py`
- Example notebooks generated from the `.\tests\` directory files
- Changes to `tests/timeseries/timeseries_test.py` to use real values for notebook testing
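For context, the core pattern here boils down to building cells with `nbformat` and writing them out. A minimal sketch of that pattern (the cell contents and output path are placeholders, not the script's actual logic):

```python
import nbformat
from nbformat.v4 import new_code_cell, new_markdown_cell, new_notebook

# Build a notebook with a markdown header and one code cell.
# The cell sources below are placeholders, not the script's real output.
nb = new_notebook()
nb.cells.append(new_markdown_cell("# timeseries examples"))
nb.cells.append(new_code_cell("result = client.timeseries(...)  # placeholder call"))

# Serialize to an .ipynb file.
nbformat.write(nb, "examples/timeseries.ipynb")
```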
Gotchas
Going this route we would want to start using real data in our mock test files, i.e. instead of using `Test` as a project we would use `KEYS`/etc. This is an initial run-through of the test generator; we may want to extend it to show more, change verbiage, or ignore certain method names.
Do we want to include this in a GitHub Action, i.e. create these files for the repo when a new test is added?
Personally I'd lean towards running this as needed.
Other thoughts
Should we add links to these examples in the pydoc for each function?
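For instance, the link could simply live in each function's docstring (hypothetical function and notebook path):

```python
def timeseries(self, project, keys):
    """Fetch timeseries data for a project.

    Example:
        See the generated notebook at examples/timeseries.ipynb
        (hypothetical path) for a runnable walkthrough.
    """
    ...
```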
Examples
Should we also generate generic, non-Jupyter-Notebook files? I.e. the files here
To best view the notebooks below, click the 3 dots and choose "View file" to render them.