
Conversation

@CherukuriPavanKumar (Collaborator)

No description provided.

Copilot AI review requested due to automatic review settings January 2, 2026 08:23

Copilot AI left a comment


Pull request overview

This PR integrates a Farm Vaidya voice assistant on the home page, enabling real-time voice conversations with an AI farming expert through a floating widget interface.

  • Adds a voice agent component using Daily.co's WebRTC infrastructure for voice communication
  • Integrates UI components, hooks, and styling for the voice interface
  • Configures Pipecat API endpoint for AI conversation backend

Reviewed changes

Copilot reviewed 11 out of 13 changed files in this pull request and generated 14 comments.

Summary per file:
src/components/VoiceAgent.tsx: Main voice assistant component with Daily.co integration, call controls, and UI
src/components/ui/button.tsx: Reusable button component with multiple variants and sizes
src/components/ui/card.tsx: Card components for structured content display
src/components/ui/tooltip.tsx: Placeholder tooltip components (no actual tooltip functionality)
src/components/ui/toaster.tsx: Placeholder toaster component (delegates to sonner)
src/hooks/use-toast.ts: Custom hook wrapping the sonner toast library for notifications
src/lib/utils.ts: Utility function combining clsx and tailwind-merge for class names
src/app/voice-agent.css: CSS animations for the voice agent UI (pulse, wave, bounce effects)
src/app/page.tsx: Integrates the VoiceAgent component and Toaster into the home page
package.json: Adds dependencies: @daily-co/daily-js, sonner, lucide-react, clsx, tailwind-merge
package-lock.json: Lock file updates for the new dependencies
public/Farm-vaidya-icon.png: Icon asset for the voice agent
VOICE_AGENT_INTEGRATION.md: Integration documentation and setup instructions
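
For orientation, a minimal sketch of how page.tsx would mount the widget based on the file summary above (import paths, export names, and JSX placement are assumptions, not copied from the diff):

// src/app/page.tsx (sketch only; the actual wiring in the PR may differ)
import VoiceAgent from "@/components/VoiceAgent";
import { Toaster } from "@/components/ui/toaster";

export default function HomePage() {
  return (
    <main>
      {/* ...existing home page content... */}
      <VoiceAgent />  {/* floating Farm Vaidya voice widget */}
      <Toaster />     {/* toast outlet; delegates to sonner */}
    </main>
  );
}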


console.log("API Key provided:", !!apiKey);

if (!apiKey) {
throw new Error("VITE_PIPECAT_TOKEN is not configured in .env file");

Copilot AI Jan 2, 2026


The error message references 'VITE_PIPECAT_TOKEN', but this is a Next.js project, which uses the 'NEXT_PUBLIC_' prefix. The error message should reference 'NEXT_PUBLIC_PIPECAT_TOKEN' to match the environment variable that is actually read.

Suggested change
throw new Error("VITE_PIPECAT_TOKEN is not configured in .env file");
throw new Error("NEXT_PUBLIC_PIPECAT_TOKEN is not configured in .env file");

description,
});
} else {
sonnerToast.success(title, {

Copilot AI Jan 2, 2026


The 'default' toast variant is displayed with sonnerToast.success(), which may not be semantically correct for all default messages. Consider using sonnerToast() or sonnerToast.info() for truly neutral/default notifications, and reserve success for actual success states.

Suggested change
- sonnerToast.success(title, {
+ sonnerToast(title, {
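
A sketch of how the wrapper in use-toast.ts could route variants to sonner while keeping success() for explicit success states (the variant names beyond 'default' are assumptions):

import { toast as sonnerToast } from "sonner";

type ToastOptions = {
  title: string;
  description?: string;
  variant?: "default" | "destructive" | "success";
};

function toast({ title, description, variant = "default" }: ToastOptions) {
  if (variant === "destructive") {
    sonnerToast.error(title, { description });   // error styling
  } else if (variant === "success") {
    sonnerToast.success(title, { description }); // explicit success
  } else {
    sonnerToast(title, { description });         // neutral default
  }
}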

const [transcript, setTranscript] = useState<TranscriptMessage[]>([]);
const [callFrame, setCallFrame] = useState<any>(null);
const [timer, setTimer] = useState(0);
const [inputText, setInputText] = useState("");

Copilot AI Jan 2, 2026


The 'inputText' state variable is declared but never displayed or used in the UI. Since 'handleSendMessage' is also unused, this state should be removed to avoid confusion and reduce unnecessary component state.

Suggested change
- const [inputText, setInputText] = useState("");

Comment on lines +22 to +81
  const [transcript, setTranscript] = useState<TranscriptMessage[]>([]);
  const [callFrame, setCallFrame] = useState<any>(null);
  const [timer, setTimer] = useState(0);
  const [inputText, setInputText] = useState("");
  const { toast } = useToast();
  const connectLockRef = useRef(false);
  const transcriptEndRef = useRef<HTMLDivElement>(null);
  const timerRef = useRef<NodeJS.Timeout | null>(null);

  // Auto-scroll transcript to bottom
  useEffect(() => {
    transcriptEndRef.current?.scrollIntoView({ behavior: "smooth" });
  }, [transcript]);

  // Auto-connect when opening
  useEffect(() => {
    if (isOpen && !isConnected && !isConnecting) {
      startConversation();
    }
  }, [isOpen]);

  // Timer logic
  useEffect(() => {
    if (isConnected) {
      timerRef.current = setInterval(() => {
        setTimer((prev) => prev + 1);
      }, 1000);
    } else {
      if (timerRef.current) {
        clearInterval(timerRef.current);
      }
      setTimer(0);
    }
    return () => {
      if (timerRef.current) {
        clearInterval(timerRef.current);
      }
    };
  }, [isConnected]);

  const formatTime = (seconds: number) => {
    const hrs = Math.floor(seconds / 3600);
    const mins = Math.floor((seconds % 3600) / 60);
    const secs = seconds % 60;
    return `${hrs.toString().padStart(2, "0")}:${mins
      .toString()
      .padStart(2, "0")}:${secs.toString().padStart(2, "0")}`;
  };

  const addToTranscript = (speaker: "user" | "ai", text: string) => {
    setTranscript((prev) => [
      ...prev,
      {
        id: Math.random().toString(36).substring(7),
        speaker,
        text,
        timestamp: new Date(),
      },
    ]);
  };

Copilot AI Jan 2, 2026


The 'transcript' state and 'addToTranscript' function are defined but the transcript is never displayed in the UI. Either implement a UI to show the transcript or remove this unused functionality to simplify the component.
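
If the transcript is meant to stay, a small rendering sketch (markup and class names are assumed, not from the PR) that would also give transcriptEndRef its scroll target:

<div className="transcript">
  {transcript.map((msg) => (
    <p key={msg.id}>
      <strong>{msg.speaker === "user" ? "You" : "Farm Vaidya"}:</strong> {msg.text}
    </p>
  ))}
  {/* sentinel element the auto-scroll effect scrolls into view */}
  <div ref={transcriptEndRef} />
</div>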

Comment on lines +28 to +35
  const transcriptEndRef = useRef<HTMLDivElement>(null);
  const timerRef = useRef<NodeJS.Timeout | null>(null);

  // Auto-scroll transcript to bottom
  useEffect(() => {
    transcriptEndRef.current?.scrollIntoView({ behavior: "smooth" });
  }, [transcript]);


Copilot AI Jan 2, 2026


The 'transcriptEndRef' is defined and used in a useEffect for auto-scrolling, but since there's no transcript UI rendered, this ref and its associated useEffect (lines 32-34) are not serving any purpose and should be removed.

Suggested change
- const transcriptEndRef = useRef<HTMLDivElement>(null);
- const timerRef = useRef<NodeJS.Timeout | null>(null);
- // Auto-scroll transcript to bottom
- useEffect(() => {
-   transcriptEndRef.current?.scrollIntoView({ behavior: "smooth" });
- }, [transcript]);
+ const timerRef = useRef<NodeJS.Timeout | null>(null);

Comment on lines +185 to +194
      })
      .on("active-speaker-change", (e: any) => {
        const localParticipant = frame.participants().local;
        if (e.activeSpeaker && e.activeSpeaker.peerId === localParticipant.user_id) {
          // User is speaking
        } else if (e.activeSpeaker) {
          // AI is speaking
        } else {
          // No one is speaking
        }

Copilot AI Jan 2, 2026


The 'active-speaker-change' event handler (lines 186-195) contains empty conditional blocks that don't perform any actions. Either implement the intended functionality for tracking when users/AI are speaking, or remove this unused event listener.

Suggested change
  })
- .on("active-speaker-change", (e: any) => {
-   const localParticipant = frame.participants().local;
-   if (e.activeSpeaker && e.activeSpeaker.peerId === localParticipant.user_id) {
-     // User is speaking
-   } else if (e.activeSpeaker) {
-     // AI is speaking
-   } else {
-     // No one is speaking
-   }
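
Alternatively, if speaking-state feedback is wanted in the UI, the handler could feed component state instead of leaving the branches empty; a sketch (the activeSpeaker state is an assumption, not part of the PR):

// alongside the other useState calls
const [activeSpeaker, setActiveSpeaker] = useState<"user" | "ai" | null>(null);

// then in the existing Daily event chain
.on("active-speaker-change", (e: any) => {
  const localParticipant = frame.participants().local;
  if (e.activeSpeaker && e.activeSpeaker.peerId === localParticipant.user_id) {
    setActiveSpeaker("user"); // local user is speaking
  } else if (e.activeSpeaker) {
    setActiveSpeaker("ai");   // remote bot participant is speaking
  } else {
    setActiveSpeaker(null);   // silence
  }
})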

Comment on lines +239 to +250
  // @ts-ignore - Used for future text input functionality
  const handleSendMessage = () => {
    if (!inputText.trim()) return;
    addToTranscript("user", inputText);
    setInputText("");
    // Here you would typically send the text to the AI if supported by the backend

    // Simulate AI response for testing
    setTimeout(() => {
      addToTranscript("ai", "I am a mock bot response. The API is bypassed for testing.");
    }, 1000);
  };

Copilot AI Jan 2, 2026


The 'handleSendMessage' function is marked with a ts-ignore comment indicating it's unused, but it contains actual implementation code including a mock response. Either remove this unused function entirely or remove the ts-ignore if it will be used in the future. Dead code should not be left in production.

    if (isOpen && !isConnected && !isConnecting) {
      startConversation();
    }
  }, [isOpen]);

Copilot AI Jan 2, 2026


The useEffect hook at line 37 has missing dependencies that could cause stale closures. The dependency array should include 'startConversation', 'isConnected', and 'isConnecting' to ensure the effect runs with the latest values. Consider wrapping 'startConversation' in useCallback to avoid infinite loops.

Suggested change
- }, [isOpen]);
+ }, [isOpen, isConnected, isConnecting, startConversation]);
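
To include startConversation in the dependency array without retriggering the effect on every render, the function can be memoized; a sketch, assuming its own dependencies are stable:

// useCallback keeps the function reference stable between renders
// (requires adding useCallback to the existing react import)
const startConversation = useCallback(async () => {
  // ...existing Pipecat/Daily connection logic...
}, [toast]); // list whatever values the function actually reads

useEffect(() => {
  if (isOpen && !isConnected && !isConnecting) {
    startConversation();
  }
}, [isOpen, isConnected, isConnecting, startConversation]);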

const endpoint = process.env.NEXT_PUBLIC_PIPECAT_ENDPOINT || "https://api.pipecat.daily.co/v1/public/webagent/start";
const apiKey = process.env.NEXT_PUBLIC_PIPECAT_TOKEN;
console.log("Connecting to Pipecat endpoint:", endpoint);
console.log("API Key provided:", !!apiKey);

Copilot AI Jan 2, 2026


Environment variables are logged to console which could expose sensitive information in production. Consider removing or protecting these console.log statements, especially the one that logs whether an API key is present.

Suggested change
console.log("API Key provided:", !!apiKey);

Comment on lines +90 to +102
/*
// Simulate API delay
await new Promise(resolve => setTimeout(resolve, 1000));

setIsConnected(true);
setIsConnecting(false);
connectLockRef.current = false;
// toast({ title: "Connected", description: "Farm Vaidya is listening (Test Mode)" });

// Simulate bot greeting
addToTranscript("ai", "Namaste! I am Farm Vaidya. How can I help you with your crops today?");
*/


Copilot AI Jan 2, 2026


The commented-out test/simulation code (lines 90-101) should be removed before merging to production. Dead code and commented blocks make the codebase harder to maintain and can cause confusion.

Suggested change
- /*
- // Simulate API delay
- await new Promise(resolve => setTimeout(resolve, 1000));
- setIsConnected(true);
- setIsConnecting(false);
- connectLockRef.current = false;
- // toast({ title: "Connected", description: "Farm Vaidya is listening (Test Mode)" });
- // Simulate bot greeting
- addToTranscript("ai", "Namaste! I am Farm Vaidya. How can I help you with your crops today?");
- */

@ManasMalla merged commit bb01b21 into main on Jan 2, 2026 (1 check failed).