
Conversation

@ZubaeyrMSFT
Collaborator

No description provided.

@ZubaeyrMSFT ZubaeyrMSFT changed the base branch from main to Model-Customization-Dev December 30, 2025 21:03
}

// UploadTrainingFile uploads and validates a training file
func (s *fineTuningServiceImpl) UploadTrainingFile(ctx context.Context, filePath string) (string, error) {
Member

Can we have only one UploadFile func?

}
defer file.Close()

uploadedFile, err := p.client.Files.New(ctx, openai.FileNewParams{
Member

Can you add the spinner code here?

return "", fmt.Errorf("uploaded file is empty")
}

return uploadedFile.ID, nil
Member

The upload needs to wait for the file to finish processing before returning the file ID; otherwise fine-tune job creation will fail.
Here is POC code:
for {
	f, err := client.Files.Get(ctx, uploadedFile.ID)
	if err != nil {
		_ = spinner.Stop(ctx)
		return "", fmt.Errorf("\nfailed to check file status: %w", err)
	}

	if f.Status == openai.FileObjectStatusProcessed {
		_ = spinner.Stop(ctx)
		break
	}

	if f.Status == openai.FileObjectStatusError {
		_ = spinner.Stop(ctx)
		return "", fmt.Errorf("\nfile processing failed with status: %s", f.Status)
	}

	fmt.Print(".")
	time.Sleep(2 * time.Second)
}

}
}

//TODO Need to set hyperparameters, method, integrations
Member

You need the complete hyperparameter mapping as well.
From the POC:
// Set hyperparameters if provided
if config.Method.Type == "supervised" && config.Method.Supervised != nil {
	hp := config.Method.Supervised.Hyperparameters
	supervisedMethod := openai.SupervisedMethodParam{
		Hyperparameters: openai.SupervisedHyperparameters{},
	}

	if hp.BatchSize != nil {
		if batchSize := convertHyperparameterToInt(hp.BatchSize); batchSize != nil {
			supervisedMethod.Hyperparameters.BatchSize = openai.SupervisedHyperparametersBatchSizeUnion{
				OfInt: openai.Int(*batchSize),
			}
		}
	}

	if hp.LearningRateMultiplier != nil {
		if lr := convertHyperparameterToFloat(hp.LearningRateMultiplier); lr != nil {
			supervisedMethod.Hyperparameters.LearningRateMultiplier = openai.SupervisedHyperparametersLearningRateMultiplierUnion{
				OfFloat: openai.Float(*lr),
			}
		}
	}

	if hp.Epochs != nil {
		if epochs := convertHyperparameterToInt(hp.Epochs); epochs != nil {
			supervisedMethod.Hyperparameters.NEpochs = openai.SupervisedHyperparametersNEpochsUnion{
				OfInt: openai.Int(*epochs),
			}
		}
	}

	jobParams.Method = openai.FineTuningJobNewParamsMethod{
		Type:       "supervised",
		Supervised: supervisedMethod,
	}
}
