
Comparing changes

Choose two branches to see what's changed or to start a new pull request.

base repository: ztjhz/BetterChatGPT
base: main
head repository: AI-Maria/BetterChatGPT
compare: main

Able to merge. These branches can be automatically merged.

Commits on Nov 26, 2023

  1. first try that works

    lectrician1 committed Nov 26, 2023
    313cfc8

Commits on Feb 19, 2024

  1. 95d4405
  2. 3a91653
  3. add modelTypes object and check that image is supported for that model when adding image selector

    lectrician1 committed Feb 19, 2024
    a58c791
  4. 9bdaaea

Commits on May 29, 2024

  1. Create .env

    AI-Maria authored May 29, 2024
    f2d821a
  2. Change info

    AI-Maria authored May 29, 2024
    601117f
  3. Change info

    AI-Maria authored May 29, 2024
    87f169b
  4. Chenge info

    AI-Maria committed May 29, 2024
    4c768ff
  5. fix

    AI-Maria committed May 29, 2024
    44d7c29

Commits on May 30, 2024

  1. Models to BW

    AI-Maria committed May 30, 2024
    1e9d3d0
  2. Return deleted yarn.lock

    AI-Maria committed May 30, 2024
    49afe16
  3. Change defaults

    AI-Maria committed May 30, 2024
    53aeb01
  4. Added docker compose instructions (#548)

    * Added docker compose instructions

    * Changed the default port to 5173
    darrenchang authored and AI-Maria committed May 30, 2024
    e611626
  5. Add Vietnamese translation (#547)

    * Add files via upload

    * Update main.json

    * Update model.json

    * Update model.json

    * Update api.json
    huanroseso3 authored and AI-Maria committed May 30, 2024
    21e54ad
  6. Added support for "gpt-4-turbo-2024-04-09" and "gpt-4-turbo" models. (#572)

    * Update chat.ts

    Added support for "gpt-4-turbo-2024-04-09" and "gpt-4-turbo" models.

    * Update chat.ts

    Added support for "gpt-4-turbo-2024-04-09" and "gpt-4-turbo" models.
    ChenYuZe519 authored and AI-Maria committed May 30, 2024
    9fdcc52
  7. Bump v1.0.5

    ztjhz authored and AI-Maria committed May 30, 2024
    85734d0
  8. Add gpt-4o (#585)

    * Add gpt-4o to chat.ts

    * Add gpt-4o to chat.ts
    quasimik authored and AI-Maria committed May 30, 2024
    a45297e
  9. Create .env

    AI-Maria committed May 30, 2024
    af04044
  10. Change info

    AI-Maria committed May 30, 2024
    0fbeb71
  11. Change info

    AI-Maria committed May 30, 2024
    eee1114
  12. Chenge info

    AI-Maria committed May 30, 2024
    cca46be
  13. fix

    AI-Maria committed May 30, 2024
    a0a13f8
  14. Models to BW

    AI-Maria committed May 30, 2024
    73e84cc
  15. Return deleted yarn.lock

    AI-Maria committed May 30, 2024
    4a9e5d2
  16. Change defaults

    AI-Maria committed May 30, 2024
    b9fe008

Commits on May 31, 2024

  1. Add migration for vision

    AI-Maria committed May 31, 2024
    0806865

Commits on Jun 2, 2024

  1. 4883290

Commits on Jun 3, 2024

  1. 202b2f3
  2. 69fe944
  3. Fix types

    AI-Maria committed Jun 3, 2024
    4e37ff9
  4. 9e6f462

Commits on Feb 3, 2025

  1. cbe50aa
  2. Update deploy.yml

    AI-Maria authored Feb 3, 2025
    c77bd36
  3. Update deploy.yml

    AI-Maria authored Feb 3, 2025
    777a838
  4. Update deploy.yml

    AI-Maria authored Feb 3, 2025
    8ef0796
  5. Update deploy.yml

    AI-Maria authored Feb 3, 2025
    78d20fe
  6. Update deploy.yml

    AI-Maria authored Feb 3, 2025
    13ca722
3 changes: 3 additions & 0 deletions .env
@@ -0,0 +1,3 @@
+VITE_CUSTOM_API_ENDPOINT=https://bestwaifu.top/v1/chat/completions
+VITE_DEFAULT_API_ENDPOINT=https://bestwaifu.top/v1/chat/completions
+VITE_DEFAULT_SYSTEM_MESSAGE={maria_system}
4 changes: 2 additions & 2 deletions .github/workflows/deploy.yml
@@ -37,10 +37,10 @@ jobs:
           VITE_GOOGLE_CLIENT_ID: ${{ secrets.GCLIENT }}

       - name: Upload artifact
-        uses: actions/upload-pages-artifact@v1
+        uses: actions/upload-pages-artifact@v3.0.1
        with:
          path: './dist'

      - name: Deploy to GitHub Pages
        id: deployment
-       uses: actions/deploy-pages@v1
+       uses: actions/deploy-pages@v4
1 change: 1 addition & 0 deletions public/locales/en/about.json
@@ -1,4 +1,5 @@
 {
+  "bw": "This text about this GUI, not BestWaifu!",
   "description": "Better ChatGPT is an amazing open-source web app that allows you to play with OpenAI's ChatGPT API for free!",
   "sourceCode": "Checkout the <0>source code</0> on GitHub and give it a ⭐️!",
   "initiative": {
1 change: 1 addition & 0 deletions public/locales/ru/about.json
@@ -1,4 +1,5 @@
 {
+  "bw": "Данный текст о графическом интерфейсе, не о BestWaifu!",
   "description": "Better ChatGPT - это потрясающее открытое веб-приложение, позволяющее вам бесплатно использовать API ChatGPT от OpenAI!",
   "sourceCode": "Ознакомьтесь с <0>исходным кодом</0> на GitHub и поставьте ему ⭐️!",
   "initiative": {
78 changes: 61 additions & 17 deletions src/api/api.ts
@@ -1,7 +1,10 @@
+import {modelTypes} from '@constants/chat';
 import { ShareGPTSubmitBodyInterface } from '@type/api';
-import { ConfigInterface, MessageInterface, ModelOptions } from '@type/chat';
+import { ConfigInterface, ImageContentInterface, MessageInterface, MessageInterfaceText, ModelOptions, TextContentInterface } from '@type/chat';
 import { isAzureEndpoint } from '@utils/api';

+
+
 export const getChatCompletion = async (
   endpoint: string,
   messages: MessageInterface[],
@@ -20,9 +23,6 @@ export const getChatCompletion = async (

     const modelmapping: Partial<Record<ModelOptions, string>> = {
       'gpt-3.5-turbo': 'gpt-35-turbo',
-      'gpt-3.5-turbo-16k': 'gpt-35-turbo-16k',
-      'gpt-3.5-turbo-1106': 'gpt-35-turbo-1106',
-      'gpt-3.5-turbo-0125': 'gpt-35-turbo-0125',
     };

     const model = modelmapping[config.model] || config.model;
@@ -43,14 +43,36 @@ export const getChatCompletion = async (
     }
   }

+  let body: BodyInit;
+
+  // make content: string if model support only text
+  if (modelTypes[config.model] == 'text') {
+    // Convert messages to MessageInterfaceText
+    const textMessages: MessageInterfaceText[] = messages.map(message => {
+      // Find the first text and ignore images
+      const textContent = message.content.find(content => content.type === 'text') as TextContentInterface;
+      return {
+        role: message.role,
+        content: textContent?.text || ''
+      };
+    });
+    body = JSON.stringify({
+      messages: textMessages,
+      ...config,
+      max_tokens: undefined,
+    });
+  } else {
+    body = JSON.stringify({
+      messages,
+      ...config,
+      max_tokens: undefined,
+    });
+  }
+
   const response = await fetch(endpoint, {
     method: 'POST',
     headers,
-    body: JSON.stringify({
-      messages,
-      ...config,
-      max_tokens: undefined,
-    }),
+    body: body,
   });
   if (!response.ok) throw new Error(await response.text());

@@ -76,7 +98,6 @@ export const getChatCompletionStream = async (

   const modelmapping: Partial<Record<ModelOptions, string>> = {
     'gpt-3.5-turbo': 'gpt-35-turbo',
-    'gpt-3.5-turbo-16k': 'gpt-35-turbo-16k',
   };

   const model = modelmapping[config.model] || config.model;
@@ -96,23 +117,46 @@ export const getChatCompletionStream = async (
     }
   }

+  let body: BodyInit;
+
+  // make content: string if model support only text
+  if (modelTypes[config.model] == 'text') {
+    // Convert messages to MessageInterfaceText
+    const textMessages: MessageInterfaceText[] = messages.map(message => {
+      // Find the first text and ignore images
+      const textContent = message.content.find(content => content.type === 'text') as TextContentInterface;
+      return {
+        role: message.role,
+        content: textContent?.text || ''
+      };
+    });
+    body = JSON.stringify({
+      messages: textMessages,
+      ...config,
+      max_tokens: config.max_tokens,
+      stream: true,
+    });
+  } else {
+    body = JSON.stringify({
+      messages,
+      ...config,
+      max_tokens: config.max_tokens,
+      stream: true,
+    });
+  }
+
   const response = await fetch(endpoint, {
     method: 'POST',
     headers,
-    body: JSON.stringify({
-      messages,
-      ...config,
-      max_tokens: undefined,
-      stream: true,
-    }),
+    body: body,
   });
   if (response.status === 404 || response.status === 405) {
     const text = await response.text();

     if (text.includes('model_not_found')) {
       throw new Error(
         text +
-          '\nMessage from Better ChatGPT:\nPlease ensure that you have access to the GPT-4 API!'
+        '\nMessage from Better ChatGPT:\nPlease ensure that you have access to the GPT-4 API!'
       );
     } else {
       throw new Error(
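The getChatCompletion hunks above flatten multimodal message content down to a plain string whenever the model's modelTypes entry is 'text': the first text part is kept and any image parts are dropped. A standalone sketch of that mapping, using minimal hypothetical types standing in for the PR's ContentInterface shapes:

```typescript
// Hypothetical minimal stand-ins for the PR's content types.
type TextContent = { type: 'text'; text: string };
type ImageContent = { type: 'image_url'; image_url: { url: string; detail: string } };
type ContentPart = TextContent | ImageContent;
type Message = { role: string; content: ContentPart[] };

// Collapse each message's content array to a single string, as the PR does
// for text-only models: keep the first text part, silently drop images.
function toTextMessages(messages: Message[]): { role: string; content: string }[] {
  return messages.map((message) => {
    const textContent = message.content.find(
      (c): c is TextContent => c.type === 'text'
    );
    return { role: message.role, content: textContent?.text ?? '' };
  });
}
```

Note the `?? ''` fallback: a message with no text part at all serializes as an empty string rather than crashing, matching the PR's `textContent?.text || ''`.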
4 changes: 2 additions & 2 deletions src/assets/icons/FolderIcon.tsx
@@ -5,8 +5,8 @@ const FolderIcon = (props: React.SVGProps<SVGSVGElement>) => {
     <svg
       viewBox='0 0 1024 1024'
       fill='currentColor'
-      height='1em'
-      width='1em'
+      height='1.7em'
+      width='1.7em'
       {...props}
     >
       <path d='M880 298.4H521L403.7 186.2a8.15 8.15 0 00-5.5-2.2H144c-17.7 0-32 14.3-32 32v592c0 17.7 14.3 32 32 32h736c17.7 0 32-14.3 32-32V330.4c0-17.7-14.3-32-32-32zM840 768H184V256h188.5l119.6 114.4H840V768z' />
1 change: 1 addition & 0 deletions src/components/AboutMenu/AboutMenu.tsx
@@ -28,6 +28,7 @@ const AboutMenu = () => {
       >
         <div className='p-6 border-b border-gray-200 dark:border-gray-600'>
           <div className='min-w-fit text-gray-900 dark:text-gray-300 text-sm flex flex-col gap-3 leading-relaxed'>
+            <h1 className='text-lg font-bold text-red'>{t('bw', { ns: 'about' })} <a href="https://bestwaifu.top/" target="_blank" className="link">BestWaifu</a></h1>
             <p>{t('description', { ns: 'about' })}</p>
             <p>
               <Trans
4 changes: 2 additions & 2 deletions src/components/ApiMenu/ApiMenu.tsx
@@ -83,7 +83,7 @@ const ApiMenu = ({
           {t('apiKey.inputLabel', { ns: 'api' })}
         </div>
         <input
-          type='text'
+          type='password'
           className='text-gray-800 dark:text-white p-3 text-sm border-none bg-gray-200 dark:bg-gray-600 rounded-md m-0 w-full mr-0 h-8 focus:outline-none'
           value={_apiKey}
           onChange={(e) => {
@@ -99,7 +99,7 @@ const ApiMenu = ({
           ns='api'
           components={[
             <a
-              href='https://platform.openai.com/account/api-keys'
+              href='https://bestwaifu.top/user/'
               className='link'
               target='_blank'
             />,
2 changes: 1 addition & 1 deletion src/components/ApiPopup/ApiPopup.tsx
@@ -61,7 +61,7 @@ const ApiPopup = () => {
           ns='api'
           components={[
             <a
-              href='https://platform.openai.com/account/api-keys'
+              href='https://bestwaifu.top/user/'
               className='link'
               target='_blank'
             />,
6 changes: 5 additions & 1 deletion src/components/Chat/ChatContent/ChatContent.tsx
@@ -12,6 +12,7 @@ import useSubmit from '@hooks/useSubmit';
 import DownloadChat from './DownloadChat';
 import CloneChat from './CloneChat';
 import ShareGPT from '@components/ShareGPT';
+import { ImageContentInterface, TextContentInterface } from '@type/chat';

 const ChatContent = () => {
   const inputRole = useStore((state) => state.inputRole);
@@ -79,7 +80,10 @@ const ChatContent = () => {

               <Message
                 role={inputRole}
-                content=''
+                // For now we always initizlize a new message with an empty text content.
+                // It is possible to send a message to the API without a TextContentInterface,
+                // but the UI would need to be modified to allow the user to control the order of text and image content
+                content={[{type: 'text', text: ''} as TextContentInterface]}
                 messageIndex={stickyIndex}
                 sticky
               />
src/components/Chat/ChatContent/Message/CommandPrompt.tsx (file header lost in extraction; path inferred from the imports elsewhere in this diff)
@@ -6,11 +6,12 @@ import { matchSorter } from 'match-sorter';
 import { Prompt } from '@type/prompt';

 import useHideOnOutsideClick from '@hooks/useHideOnOutsideClick';
+import { ContentInterface } from '@type/chat';

 const CommandPrompt = ({
   _setContent,
 }: {
-  _setContent: React.Dispatch<React.SetStateAction<string>>;
+  _setContent: React.Dispatch<React.SetStateAction<ContentInterface[]>>;
 }) => {
   const { t } = useTranslation();
   const prompts = useStore((state) => state.prompts);
@@ -69,7 +70,7 @@ const CommandPrompt = ({
           <li
             className='px-4 py-2 hover:bg-gray-100 dark:hover:bg-gray-600 dark:hover:text-white cursor-pointer text-start w-full'
             onClick={() => {
-              _setContent((prev) => prev + cp.prompt);
+              _setContent((prev) => [{type: 'text', text: prev + cp.prompt}, ...prev.slice(1)]);
               setDropDown(false);
             }}
             key={cp.id}
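In the onClick change above, the updater now receives `prev` as a content array but still computes `prev + cp.prompt`, which coerces the array to a string before concatenating. A sketch of what the update presumably intends — append the prompt to the first text part and keep any image parts untouched (the types are hypothetical stand-ins for the PR's ContentInterface):

```typescript
// Hypothetical minimal stand-ins for the PR's content types.
type TextContent = { type: 'text'; text: string };
type ImageContent = { type: 'image_url'; image_url: { url: string; detail: string } };
type ContentPart = TextContent | ImageContent;

// Append a command prompt to the leading text part, preserving trailing
// image parts. Reads prev[0].text rather than concatenating the array itself.
function appendPrompt(prev: ContentPart[], prompt: string): ContentPart[] {
  const first = prev[0] as TextContent | undefined;
  return [
    { type: 'text', text: (first?.text ?? '') + prompt },
    ...prev.slice(1),
  ];
}
```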
4 changes: 2 additions & 2 deletions src/components/Chat/ChatContent/Message/Message.tsx
@@ -4,7 +4,7 @@ import useStore from '@store/store';
 import Avatar from './Avatar';
 import MessageContent from './MessageContent';

-import { Role } from '@type/chat';
+import { ContentInterface, Role } from '@type/chat';
 import RoleSelector from './RoleSelector';

 // const backgroundStyle: { [role in Role]: string } = {
@@ -22,7 +22,7 @@ const Message = React.memo(
     sticky = false,
   }: {
     role: Role;
-    content: string;
+    content: ContentInterface[],
     messageIndex: number;
     sticky?: boolean;
   }) => {
3 changes: 2 additions & 1 deletion src/components/Chat/ChatContent/Message/MessageContent.tsx
@@ -3,6 +3,7 @@ import useStore from '@store/store';

 import ContentView from './View/ContentView';
 import EditView from './View/EditView';
+import { ContentInterface } from '@type/chat';

 const MessageContent = ({
   role,
@@ -11,7 +12,7 @@ const MessageContent = ({
   sticky = false,
 }: {
   role: string;
-  content: string;
+  content: ContentInterface[];
   messageIndex: number;
   sticky?: boolean;
 }) => {
4 changes: 2 additions & 2 deletions src/components/Chat/ChatContent/Message/NewMessageButton.tsx
@@ -3,7 +3,7 @@ import useStore from '@store/store';

 import PlusIcon from '@icon/PlusIcon';

-import { ChatInterface } from '@type/chat';
+import { ChatInterface, TextContentInterface } from '@type/chat';
 import { generateDefaultChat } from '@constants/chat';

 const NewMessageButton = React.memo(
@@ -38,7 +38,7 @@ const NewMessageButton = React.memo(
         JSON.stringify(useStore.getState().chats)
       );
       updatedChats[currentChatIndex].messages.splice(messageIndex + 1, 0, {
-        content: '',
+        content: [{'type': 'text', 'text': ''} as TextContentInterface],
         role: 'user',
       });
       setChats(updatedChats);
17 changes: 12 additions & 5 deletions src/components/Chat/ChatContent/Message/View/ContentView.tsx
@@ -19,7 +19,7 @@ import CrossIcon from '@icon/CrossIcon';

 import useSubmit from '@hooks/useSubmit';

-import { ChatInterface } from '@type/chat';
+import { ChatInterface, ContentInterface, ImageContentInterface, TextContentInterface } from '@type/chat';

 import { codeLanguageSubset } from '@constants/chat';

@@ -41,7 +41,7 @@ const ContentView = memo(
     messageIndex,
   }: {
     role: string;
-    content: string;
+    content: ContentInterface[],
     setIsEdit: React.Dispatch<React.SetStateAction<boolean>>;
     messageIndex: number;
   }) => {
@@ -100,7 +100,7 @@ const ContentView = memo(
     };

     const handleCopy = () => {
-      navigator.clipboard.writeText(content);
+      navigator.clipboard.writeText((content[0] as TextContentInterface).text);
     };

     return (
@@ -129,12 +129,19 @@ const ContentView = memo(
             p,
           }}
         >
-          {content}
+          {(content[0] as TextContentInterface).text}
         </ReactMarkdown>
       ) : (
-        <span className='whitespace-pre-wrap'>{content}</span>
+        <span className='whitespace-pre-wrap'>{(content[0] as TextContentInterface).text}</span>
       )}
     </div>
+    <div className="flex gap-4">
+      {(content.slice(1) as ImageContentInterface[]).map((image, index) => (
+        <div key={index} className="image-container">
+          <img src={image.image_url.url} alt={`uploaded-${index}`} className="h-20" />
+        </div>
+      ))}
+    </div>
     <div className='flex justify-end gap-2 w-full mt-2'>
       {isDelete || (
         <>
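The render and handleCopy changes above read only `content[0]`, which is fine under the PR's invariant that every message starts with exactly one text part. If a message ever carried more than one text part, a small helper could gather them all; a sketch under that assumption, with hypothetical minimal types:

```typescript
// Hypothetical minimal stand-ins for the PR's content types.
type ContentPart =
  | { type: 'text'; text: string }
  | { type: 'image_url'; image_url: { url: string; detail: string } };

// Join every text part of a message, skipping images — a defensive
// alternative to reading content[0] alone.
function collectText(content: ContentPart[]): string {
  return content
    .filter((c): c is { type: 'text'; text: string } => c.type === 'text')
    .map((c) => c.text)
    .join('\n');
}
```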
250 changes: 186 additions & 64 deletions src/components/Chat/ChatContent/Message/View/EditView.tsx
@@ -1,22 +1,24 @@
-import React, { memo, useEffect, useState } from 'react';
+import React, { memo, useEffect, useState, useRef, ChangeEvent } from 'react';
 import { useTranslation } from 'react-i18next';
 import useStore from '@store/store';

 import useSubmit from '@hooks/useSubmit';

-import { ChatInterface } from '@type/chat';
+import { ChatInterface, Content, ContentInterface, ImageContentInterface, TextContentInterface } from '@type/chat';

 import PopupModal from '@components/PopupModal';
 import TokenCount from '@components/TokenCount';
 import CommandPrompt from '../CommandPrompt';
+import FolderIcon from '@icon/FolderIcon';
+import { defaultModel, modelTypes } from '@constants/chat';

 const EditView = ({
-  content,
+  content: content,
   setIsEdit,
   messageIndex,
   sticky,
 }: {
-  content: string;
+  content: ContentInterface[];
   setIsEdit: React.Dispatch<React.SetStateAction<boolean>>;
   messageIndex: number;
   sticky?: boolean;
@@ -25,7 +27,7 @@ const EditView = ({
   const setChats = useStore((state) => state.setChats);
   const currentChatIndex = useStore((state) => state.currentChatIndex);

-  const [_content, _setContent] = useState<string>(content);
+  const [_content, _setContent] = useState<ContentInterface[]>(content);
   const [isModalOpen, setIsModalOpen] = useState<boolean>(false);
   const textareaRef = React.createRef<HTMLTextAreaElement>();

@@ -63,15 +65,62 @@ const EditView = ({
     }
   };

+  // convert message blob urls to base64
+  const blobToBase64 = async (blob: Blob) => {
+    return new Promise((resolve, reject) => {
+      const reader = new FileReader();
+      reader.onerror = reject;
+      reader.onload = () => {
+        resolve(reader.result);
+      };
+      reader.readAsDataURL(blob);
+    });
+  };
+
+  const handleFileChange = async (e: React.ChangeEvent<HTMLInputElement>) => {
+    const files = e.target.files!;
+    const newImageURLs = Array.from(files).map((file: Blob) => URL.createObjectURL(file));
+    const newImages = await Promise.all(newImageURLs.map(async (url) => {
+      const blob = await fetch(url).then((r) => r.blob());
+      return {
+        type: 'image_url',
+        image_url: {
+          detail: 'auto',
+          url: await blobToBase64(blob) as string
+        }
+      } as ImageContentInterface
+    }));
+    const updatedContent = [..._content, ...newImages];
+
+    _setContent(updatedContent);
+  };
+
+  const handleImageDetailChange = (index: number, detail: string) => {
+    const updatedImages = [..._content];
+    updatedImages[index + 1].image_url.detail = detail;
+    _setContent(updatedImages);
+  }
+
+  const handleRemoveImage = (index: number) => {
+    const updatedImages = [..._content];
+    updatedImages.splice(index + 1, 1);
+
+    _setContent(updatedImages);
+  };
+
   const handleSave = () => {
-    if (sticky && (_content === '' || useStore.getState().generating)) return;
+    if (sticky && ((_content[0] as TextContentInterface).text === '' || useStore.getState().generating)) return;
     const updatedChats: ChatInterface[] = JSON.parse(
       JSON.stringify(useStore.getState().chats)
     );
     const updatedMessages = updatedChats[currentChatIndex].messages;

     if (sticky) {
       updatedMessages.push({ role: inputRole, content: _content });
-      _setContent('');
+      _setContent([{
+        type: 'text',
+        text: ''
+      } as TextContentInterface]);
       resetTextAreaHeight();
     } else {
       updatedMessages[messageIndex].content = _content;
@@ -87,11 +136,15 @@ const EditView = ({
       JSON.stringify(useStore.getState().chats)
     );
     const updatedMessages = updatedChats[currentChatIndex].messages;

     if (sticky) {
-      if (_content !== '') {
+      if ((_content[0] as TextContentInterface).text !== '') {
         updatedMessages.push({ role: inputRole, content: _content });
       }
-      _setContent('');
+      _setContent([{
+        type: 'text',
+        text: ''
+      } as TextContentInterface]);
       resetTextAreaHeight();
     } else {
       updatedMessages[messageIndex].content = _content;
@@ -110,7 +163,7 @@ const EditView = ({
       textareaRef.current.style.height = 'auto';
       textareaRef.current.style.height = `${textareaRef.current.scrollHeight}px`;
     }
-  }, [_content]);
+  }, [(_content[0] as TextContentInterface).text]);

   useEffect(() => {
     if (textareaRef.current) {
@@ -122,31 +175,34 @@ const EditView = ({
   return (
     <>
       <div
-        className={`w-full ${
-          sticky
-            ? 'py-2 md:py-3 px-2 md:px-4 border border-black/10 bg-white dark:border-gray-900/50 dark:text-white dark:bg-gray-700 rounded-md shadow-[0_0_10px_rgba(0,0,0,0.10)] dark:shadow-[0_0_15px_rgba(0,0,0,0.10)]'
-            : ''
-        }`}
+        className={`w-full ${sticky
+          ? 'py-2 md:py-3 px-2 md:px-4 border border-black/10 bg-white dark:border-gray-900/50 dark:text-white dark:bg-gray-700 rounded-md shadow-[0_0_10px_rgba(0,0,0,0.10)] dark:shadow-[0_0_15px_rgba(0,0,0,0.10)]'
+          : ''
+          }`}
       >
         <textarea
           ref={textareaRef}
           className='m-0 resize-none rounded-lg bg-transparent overflow-y-hidden focus:ring-0 focus-visible:ring-0 leading-7 w-full placeholder:text-gray-500/40'
           onChange={(e) => {
-            _setContent(e.target.value);
+            _setContent((prev) => [{ type: 'text', text: e.target.value }, ...prev.slice(1)])
           }}
-          value={_content}
+          value={(_content[0] as TextContentInterface).text}
           placeholder={t('submitPlaceholder') as string}
           onKeyDown={handleKeyDown}
           rows={1}
         ></textarea>
       </div>
       <EditViewButtons
         sticky={sticky}
+        handleFileChange={handleFileChange}
+        handleImageDetailChange={handleImageDetailChange}
+        handleRemoveImage={handleRemoveImage}
         handleGenerate={handleGenerate}
         handleSave={handleSave}
         setIsModalOpen={setIsModalOpen}
         setIsEdit={setIsEdit}
         _setContent={_setContent}
+        _content={_content}
       />
       {isModalOpen && (
         <PopupModal
@@ -163,83 +219,149 @@ const EditView = ({
 const EditViewButtons = memo(
   ({
     sticky = false,
+    handleFileChange,
+    handleImageDetailChange,
+    handleRemoveImage,
     handleGenerate,
     handleSave,
     setIsModalOpen,
     setIsEdit,
     _setContent,
+    _content
   }: {
     sticky?: boolean;
+    handleFileChange: (e: React.ChangeEvent<HTMLInputElement>) => void;
+    handleImageDetailChange: (index: number, e: string) => void;
+    handleRemoveImage: (index: number) => void;
     handleGenerate: () => void;
     handleSave: () => void;
     setIsModalOpen: React.Dispatch<React.SetStateAction<boolean>>;
     setIsEdit: React.Dispatch<React.SetStateAction<boolean>>;
-    _setContent: React.Dispatch<React.SetStateAction<string>>;
+    _setContent: React.Dispatch<React.SetStateAction<ContentInterface[]>>;
+    _content: ContentInterface[];
   }) => {
     const { t } = useTranslation();
     const generating = useStore.getState().generating;
     const advancedMode = useStore((state) => state.advancedMode);
+    const model = useStore((state) => state.chats != undefined ? state.chats![state.currentChatIndex].config.model : defaultModel);
+    const fileInputRef = useRef(null);
+
+    const handleUploadButtonClick = () => {
+      // Trigger the file input when the custom button is clicked
+      (fileInputRef.current! as HTMLInputElement).click();
+    };

     return (
-      <div className='flex'>
-        <div className='flex-1 text-center mt-2 flex justify-center'>
-          {sticky && (
-            <button
-              className={`btn relative mr-2 btn-primary ${
-                generating ? 'cursor-not-allowed opacity-40' : ''
-              }`}
-              onClick={handleGenerate}
-              aria-label={t('generate') as string}
-            >
-              <div className='flex items-center justify-center gap-2'>
-                {t('generate')}
-              </div>
-            </button>
-          )}
-
-          {sticky || (
-            <button
-              className='btn relative mr-2 btn-primary'
-              onClick={() => {
-                !generating && setIsModalOpen(true);
-              }}
-            >
-              <div className='flex items-center justify-center gap-2'>
-                {t('generate')}
-              </div>
-            </button>
-          )}
-
-          <button
-            className={`btn relative mr-2 ${
-              sticky
-                ? `btn-neutral ${
-                    generating ? 'cursor-not-allowed opacity-40' : ''
-                  }`
-                : 'btn-neutral'
-            }`}
-            onClick={handleSave}
-            aria-label={t('save') as string}
-          >
-            <div className='flex items-center justify-center gap-2'>
-              {t('save')}
-            </div>
-          </button>
-
-          {sticky || (
-            <button
-              className='btn relative btn-neutral'
-              onClick={() => setIsEdit(false)}
-              aria-label={t('cancel') as string}
-            >
-              <div className='flex items-center justify-center gap-2'>
-                {t('cancel')}
-              </div>
-            </button>
-          )}
+      <div>
+        {modelTypes[model] == 'image' && (
+          <div className='flex justify-center'>
+            <div className="flex gap-5">
+              {_content.slice(1).map((image, index) => (
+                <div key={index} className="image-container flex flex-col gap-2">
+                  <img src={image.image_url.url} alt={`uploaded-${index}`} className="h-10" />
+                  <div className='flex flex-row gap-3'>
+                    <select
+                      onChange={(event) => handleImageDetailChange(index, event.target.value)}
+                      title="Select image resolution"
+                      aria-label="Select image resolution"
+                      defaultValue={image.image_url.detail}
+                      style={{ color: 'black' }}
+                    >
+                      <option value="auto">Auto</option>
+                      <option value="high">High</option>
+                      <option value="low">Low</option>
+                    </select>
+                    <button
+                      className="close-button"
+                      onClick={() => handleRemoveImage(index)}
+                      aria-label="Remove Image"
+                    >
+                      &times;
+                    </button>
+                  </div>
+                </div>
+              ))}
+
+              <button
+                className='btn relative btn-neutral h-16 w-16'
+                onClick={handleUploadButtonClick}
+                aria-label={'Upload Images'}
+              >
+                <div className='flex items-center justify-center gap-2 h-full w-full'>
+                  <FolderIcon />
+                </div>
+              </button>
+
+              {/* Hidden file input */}
+              <input
+                type="file"
+                ref={fileInputRef}
+                style={{ display: 'none' }}
+                onChange={handleFileChange}
+                accept="image/*"
+                multiple
+              />
+            </div>
+          </div>
+        )}
+        <div className='flex'>
+          <div className='flex-1 text-center mt-2 flex justify-center'>
+            {sticky && (
+              <button
+                className={`btn relative mr-2 btn-primary ${generating ? 'cursor-not-allowed opacity-40' : ''
+                  }`}
+                onClick={handleGenerate}
+                aria-label={t('generate') as string}
+              >
+                <div className='flex items-center justify-center gap-2'>
+                  {t('generate')}
+                </div>
+              </button>
+            )}
+
+            {sticky || (
+              <button
+                className='btn relative mr-2 btn-primary'
+                onClick={() => {
+                  !generating && setIsModalOpen(true);
+                }}
+              >
+                <div className='flex items-center justify-center gap-2'>
+                  {t('generate')}
+                </div>
+              </button>
+            )}
+
+            <button
+              className={`btn relative mr-2 ${sticky
+                ? `btn-neutral ${generating ? 'cursor-not-allowed opacity-40' : ''
+                }`
+                : 'btn-neutral'
+                }`}
+              onClick={handleSave}
+              aria-label={t('save') as string}
+            >
+              <div className='flex items-center justify-center gap-2'>
+                {t('save')}
+              </div>
+            </button>
+
+            {sticky || (
+              <button
+                className='btn relative btn-neutral'
+                onClick={() => setIsEdit(false)}
+                aria-label={t('cancel') as string}
+              >
+                <div className='flex items-center justify-center gap-2'>
+                  {t('cancel')}
+                </div>
+              </button>
+            )}
+          </div>
+          {sticky && advancedMode && <TokenCount />}
+          <CommandPrompt _setContent={_setContent} />
         </div>
-        {sticky && advancedMode && <TokenCount />}
-        <CommandPrompt _setContent={_setContent} />
       </div>
     );
   }
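handleImageDetailChange above copies the content array with a spread but then assigns into the nested `image_url` object, mutating an object that React state may still reference. A sketch of a fully immutable variant (the `index + 1` offset skips the leading text part, as in the PR; the types are hypothetical stand-ins):

```typescript
// Hypothetical minimal stand-ins for the PR's content types.
type ImageContent = { type: 'image_url'; image_url: { url: string; detail: string } };
type ContentPart = { type: 'text'; text: string } | ImageContent;

// Return a new array in which only the targeted image's detail changes;
// neither the original array nor its nested objects are mutated.
function setImageDetail(
  content: ContentPart[],
  index: number,
  detail: string
): ContentPart[] {
  return content.map((c, i) =>
    i === index + 1 && c.type === 'image_url'
      ? { ...c, image_url: { ...c.image_url, detail } }
      : c
  );
}
```

The same `map`-and-copy pattern would also suit handleRemoveImage (via `filter`), keeping every state update referentially fresh.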
20 changes: 20 additions & 0 deletions src/components/Menu/MenuOptions/BestWaifu.tsx
@@ -0,0 +1,20 @@
+import React from 'react';
+import { useTranslation } from 'react-i18next';
+
+import HeartIcon from '@icon/HeartIcon';
+
+const BestWaifu = () => {
+  const { t } = useTranslation();
+  return (
+    <a
+      className='flex py-2 px-2 items-center gap-3 rounded-md hover:bg-gray-500/10 transition-colors duration-200 text-white cursor-pointer text-sm'
+      href='https://bestwaifu.top/'
+      target='_blank'
+    >
+      <HeartIcon />
+      BestWaifu
+    </a>
+  );
+};
+
+export default BestWaifu;
2 changes: 2 additions & 0 deletions src/components/Menu/MenuOptions/MenuOptions.tsx
@@ -3,6 +3,7 @@ import useStore from '@store/store';

 import Api from './Api';
 import Me from './Me';
+import BestWaifu from './BestWaifu';
 import AboutMenu from '@components/AboutMenu';
 import ImportExportChat from '@components/ImportExportChat';
 import SettingsMenu from '@components/SettingsMenu';
@@ -25,6 +26,7 @@ const MenuOptions = () => {
     >
       {countTotalTokens && <TotalTokenCostDisplay />}
       {googleClientId && <GoogleSync clientId={googleClientId} />}
+      <BestWaifu />
       <AboutMenu />
       <ImportExportChat />
       <Api />
4 changes: 2 additions & 2 deletions src/components/SettingsMenu/TotalTokenCost.tsx
@@ -54,7 +54,7 @@ const TotalTokenCost = () => {
         <thead className='text-xs text-gray-700 uppercase bg-gray-50 dark:bg-gray-700 dark:text-gray-400'>
           <tr>
             <th className='px-4 py-2'>{t('model', { ns: 'model' })}</th>
-            <th className='px-4 py-2'>USD</th>
+            <th className='px-4 py-2'>kWT</th>
           </tr>
         </thead>
         <tbody>
@@ -128,7 +128,7 @@ export const TotalTokenCostDisplay = () => {
   return (
     <a className='flex py-2 px-2 items-center gap-3 rounded-md hover:bg-gray-500/10 transition-colors duration-200 text-white text-sm'>
       <CalculatorIcon />
-      {`USD ${totalCost.toPrecision(3)}`}
+      {`${totalCost.toPrecision(3)} kWT`}
     </a>
   );
 };
2 changes: 1 addition & 1 deletion src/components/TokenCount/TokenCount.tsx
@@ -34,7 +34,7 @@ const TokenCount = React.memo(() => {
   return (
     <div className='absolute top-[-16px] right-0'>
       <div className='text-xs italic text-gray-900 dark:text-gray-300'>
-        Tokens: {tokenCount} (${cost})
+        Tokens: {tokenCount} ({cost} kWT)
       </div>
     </div>
   );
62 changes: 22 additions & 40 deletions src/constants/chat.ts
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
import { v4 as uuidv4 } from 'uuid';
import { ChatInterface, ConfigInterface, ModelOptions } from '@type/chat';
import { ChatInterface, ConfigInterface, ModelOptions, ModelType, TextContentInterface } from '@type/chat';
import useStore from '@store/store';

const date = new Date();
@@ -14,31 +14,18 @@ const dateString =
export const _defaultSystemMessage =
import.meta.env.VITE_DEFAULT_SYSTEM_MESSAGE ??
`You are ChatGPT, a large language model trained by OpenAI.
Carefully heed the user's instructions.
Carefully heed the user's instructions.
Respond using Markdown.`;

export const modelOptions: ModelOptions[] = [
'gpt-3.5-turbo',
'gpt-3.5-turbo-16k',
'gpt-3.5-turbo-1106',
'gpt-3.5-turbo-0125',
'gpt-4',
'gpt-4-32k',
'gpt-4-1106-preview',
'gpt-4-0125-preview',
'gpt-4-turbo',
'gpt-4-turbo-2024-04-09',
'gpt-4o',
'gpt-4o-2024-05-13',
// 'gpt-3.5-turbo-0301',
// 'gpt-4-0314',
// 'gpt-4-32k-0314',
'gpt-4-vision-preview',
];

export const defaultModel = 'gpt-3.5-turbo';

export const modelMaxToken = {
'gpt-3.5-turbo': 4096,
'gpt-3.5-turbo': 8192,
'gpt-3.5-turbo-0301': 4096,
'gpt-3.5-turbo-0613': 4096,
'gpt-3.5-turbo-16k': 16384,
@@ -53,16 +40,13 @@ export const modelMaxToken = {
'gpt-4-32k-0613': 32768,
'gpt-4-1106-preview': 128000,
'gpt-4-0125-preview': 128000,
'gpt-4-turbo': 128000,
'gpt-4-turbo-2024-04-09': 128000,
'gpt-4o': 128000,
'gpt-4o-2024-05-13': 128000,
'gpt-4-vision-preview': 4096
};

export const modelCost = {
'gpt-3.5-turbo': {
prompt: { price: 0.0015, unit: 1000 },
completion: { price: 0.002, unit: 1000 },
prompt: { price: 5, unit: 1000 },
completion: { price: 15, unit: 1000 },
},
'gpt-3.5-turbo-0301': {
prompt: { price: 0.0015, unit: 1000 },
@@ -120,31 +104,29 @@ export const modelCost = {
prompt: { price: 0.01, unit: 1000 },
completion: { price: 0.03, unit: 1000 },
},
'gpt-4-turbo': {
prompt: { price: 0.01, unit: 1000 },
completion: { price: 0.03, unit: 1000 },
},
'gpt-4-turbo-2024-04-09': {
'gpt-4-vision-preview': {
prompt: { price: 0.01, unit: 1000 },
completion: { price: 0.03, unit: 1000 },
},
'gpt-4o': {
prompt: { price: 0.005, unit: 1000 },
completion: { price: 0.015, unit: 1000 },
},
'gpt-4o-2024-05-13': {
prompt: { price: 0.005, unit: 1000 },
completion: { price: 0.015, unit: 1000 },
},
}
};

type ModelTypes = {
[x in ModelOptions]: ModelType;
};

// Types of input the model can support. If image, show an image upload button
export const modelTypes: ModelTypes = {
'gpt-3.5-turbo': 'text',
'gpt-4-vision-preview': 'image'
};
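The `modelTypes` map drives the UI decision described in the comment: the image-upload control should only appear for models typed `'image'`. A self-contained sketch of that gate (types simplified locally for illustration):

```typescript
// Simplified local copy of the modelTypes idea from src/constants/chat.ts.
type ModelType = 'text' | 'image';

const modelTypes: Record<string, ModelType> = {
  'gpt-3.5-turbo': 'text',
  'gpt-4-vision-preview': 'image',
};

// Only render the image selector when the current model accepts images.
function supportsImageInput(model: string): boolean {
  return modelTypes[model] === 'image';
}
```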

export const defaultUserMaxToken = 4000;

export const _defaultChatConfig: ConfigInterface = {
model: defaultModel,
max_tokens: defaultUserMaxToken,
temperature: 1,
presence_penalty: 0,
temperature: 0.4,
presence_penalty: 0.1,
top_p: 1,
frequency_penalty: 0,
};
@@ -157,7 +139,7 @@ export const generateDefaultChat = (
title: title ? title : 'New Chat',
messages:
useStore.getState().defaultSystemMessage.length > 0
? [{ role: 'system', content: useStore.getState().defaultSystemMessage }]
? [{ role: 'system', content: [{type: 'text', text: useStore.getState().defaultSystemMessage} as TextContentInterface] }]
: [],
config: { ...useStore.getState().defaultChatConfig },
titleSet: false,
29 changes: 24 additions & 5 deletions src/hooks/useSubmit.ts

@@ -1,7 +1,7 @@
import React from 'react';
import useStore from '@store/store';
import { useTranslation } from 'react-i18next';
import { ChatInterface, MessageInterface } from '@type/chat';
import { ChatInterface, ContentInterface, MessageInterface, TextContentInterface } from '@type/chat';
import { getChatCompletion, getChatCompletionStream } from '@api/api';
import { parseEventSource } from '@api/helper';
import { limitMessageTokens, updateTotalTokenUsed } from '@utils/messageUtils';
@@ -59,7 +59,10 @@ const useSubmit = () => {

updatedChats[currentChatIndex].messages.push({
role: 'assistant',
content: '',
content: [{
type: 'text',
text: ''
} as TextContentInterface],
});

setChats(updatedChats);
@@ -132,7 +135,7 @@ const useSubmit = () => {
JSON.stringify(useStore.getState().chats)
);
const updatedMessages = updatedChats[currentChatIndex].messages;
updatedMessages[updatedMessages.length - 1].content += resultString;
(updatedMessages[updatedMessages.length - 1].content[0] as TextContentInterface).text += resultString;
setChats(updatedChats);
}
}
@@ -165,15 +168,31 @@ const useSubmit = () => {
currChats &&
!currChats[currentChatIndex]?.titleSet
) {
function extractContentMessages(contents : ContentInterface[]) {
return contents.map(content => {
if (content.type === 'text') {
return content.text;
} else if (content.type === 'image_url') {
return '(image)';
}
return '';
}).join(' ');
}
const messages_length = currChats[currentChatIndex].messages.length;
const assistant_message =
currChats[currentChatIndex].messages[messages_length - 1].content;
const user_message =
currChats[currentChatIndex].messages[messages_length - 2].content;

const assistant_content_string = extractContentMessages(assistant_message);
const user_content_string = extractContentMessages(user_message);

const message: MessageInterface = {
role: 'user',
content: `Generate a title in less than 6 words for the following message (language: ${i18n.language}):\n"""\nUser: ${user_message}\nAssistant: ${assistant_message}\n"""`,
content: [{
type: 'text',
text: `Generate a title in less than 6 words for the following message (language: ${i18n.language}):\n"""\nUser: ${user_content_string}\nAssistant: ${assistant_content_string}\n"""`,
} as TextContentInterface]
};

let title = (await generateTitle([message])).trim();
@@ -192,7 +211,7 @@ const useSubmit = () => {
const model = _defaultChatConfig.model;
updateTotalTokenUsed(model, [message], {
role: 'assistant',
content: title,
content: [{type: 'text', text: title} as TextContentInterface],
});
}
}
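The `extractContentMessages` helper added above flattens a content array back into a plain string for title generation, substituting `(image)` for image parts. A standalone copy with minimal local types, so the behavior can be checked in isolation:

```typescript
// Local simplification of ContentInterface: text parts carry `text`,
// image parts are represented only by their type here.
interface ContentPart { type: 'text' | 'image_url'; text?: string; }

function extractContentMessages(contents: ContentPart[]): string {
  return contents
    .map((content) => {
      if (content.type === 'text') return content.text ?? '';
      if (content.type === 'image_url') return '(image)';
      return '';
    })
    .join(' ');
}
```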
4 changes: 2 additions & 2 deletions src/store/config-slice.ts
@@ -42,9 +42,9 @@ export const createConfigSlice: StoreSlice<ConfigSlice> = (set, get) => ({
advancedMode: true,
defaultChatConfig: _defaultChatConfig,
defaultSystemMessage: _defaultSystemMessage,
inlineLatex: false,
inlineLatex: true,
markdownMode: true,
countTotalTokens: false,
countTotalTokens: true,
totalTokenUsed: {},
setOpenConfig: (openConfig: boolean) => {
set((prev: ConfigSlice) => ({
17 changes: 17 additions & 0 deletions src/store/migrate.ts
@@ -1,6 +1,7 @@
import { v4 as uuidv4 } from 'uuid';

import {
ContentInterface,
Folder,
FolderCollection,
LocalStorageInterfaceV0ToV1,
@@ -104,3 +105,19 @@ export const migrateV7 = (persistedState: LocalStorageInterfaceV7oV8) => {
chat.id = uuidv4();
});
};

export const migrateV8 = (persistedState: LocalStorageInterfaceV7oV8) => {
persistedState.chats.forEach((chat) => {
chat.messages.forEach((message) => {
// Check if the old content structure exists
if (typeof message.content === 'string') {
// Convert the old content string to the new content array structure
message.content = [{
type: 'text', // assuming all old content is of type 'text'
text: message.content
}] as ContentInterface[];
}
});
});
};
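The core of `migrateV8` is the per-message conversion: a legacy string `content` becomes a one-element array of text parts, and already-migrated messages are left untouched. A self-contained sketch of that step, with the store types simplified locally:

```typescript
// Simplified message shape: legacy messages hold a string, migrated ones
// hold an array of text content parts.
interface TextContent { type: 'text'; text: string; }
interface Message { role: string; content: string | TextContent[]; }

// Wrap legacy string content in the new array structure; no-op otherwise,
// so running the migration twice is safe.
function migrateMessage(message: Message): void {
  if (typeof message.content === 'string') {
    message.content = [{ type: 'text', text: message.content }];
  }
}
```

Because the function checks `typeof message.content === 'string'` first, re-running it against an already-migrated store leaves the data unchanged.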

7 changes: 6 additions & 1 deletion src/store/store.ts
@@ -15,6 +15,7 @@ import {
LocalStorageInterfaceV5ToV6,
LocalStorageInterfaceV6ToV7,
LocalStorageInterfaceV7oV8,
LocalStorageInterfaceV8oV9,
} from '@type/chat';
import {
migrateV0,
@@ -25,6 +26,7 @@ import {
migrateV5,
migrateV6,
migrateV7,
migrateV8,
} from './migrate';

export type StoreState = ChatSlice &
@@ -74,7 +76,7 @@ const useStore = create<StoreState>()(
{
name: 'free-chat-gpt',
partialize: (state) => createPartializedState(state),
version: 8,
version: 9,
migrate: (persistedState, version) => {
switch (version) {
case 0:
@@ -94,6 +96,9 @@ const useStore = create<StoreState>()(
case 7:
migrateV7(persistedState as LocalStorageInterfaceV7oV8);
break;
case 8:
migrateV8(persistedState as LocalStorageInterfaceV8oV9);
break;
}
return persistedState as StoreState;
},
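The persisted-store version is bumped to 9 and a `case 8` is added to the `migrate` switch. One common variant of this dispatcher (a hypothetical sketch, not the store's actual switch, which runs a single case per version) applies every migration from the persisted version up to the latest:

```typescript
// Hypothetical loop-based migration dispatcher; the real store uses a
// switch over the persisted version instead.
type Migration = (state: Record<string, unknown>) => void;

function runMigrations(
  state: Record<string, unknown>,
  fromVersion: number,
  migrations: Migration[]
): Record<string, unknown> {
  // Apply migrations in order, starting at the persisted version.
  for (let v = fromVersion; v < migrations.length; v++) {
    migrations[v](state);
  }
  return state;
}
```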
54 changes: 38 additions & 16 deletions src/types/chat.ts
@@ -1,12 +1,40 @@
import { Prompt } from './prompt';
import { Theme } from './theme';

// The types in this file must mimic the structure of the API request

export type Content = 'text' | 'image_url';
export type ImageDetail = 'low' | 'high' | 'auto';
export const imageDetails: ImageDetail[] = ['auto'];
export type Role = 'user' | 'assistant' | 'system';
export const roles: Role[] = ['user', 'assistant', 'system'];

export interface ImageContentInterface extends ContentInterface {
type: 'image_url';
image_url: {
url: string; // base64 or image URL
detail: ImageDetail;
}
}

export interface TextContentInterface extends ContentInterface {
type: 'text';
text: string;
}

export interface ContentInterface {
[x: string]: any;
type: Content;
}

export interface MessageInterfaceText {
role: Role;
content: String;
}

export interface MessageInterface {
role: Role;
content: string;
content: ContentInterface[];
}
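With `MessageInterface.content` now an array of content parts, a mixed text-plus-image message looks like the literal below (field names follow the OpenAI chat request format mirrored by these interfaces; the URL is a placeholder):

```typescript
// Illustrative mixed-content message matching the interfaces above.
const message = {
  role: 'user' as const,
  content: [
    { type: 'text' as const, text: 'What is in this picture?' },
    {
      type: 'image_url' as const,
      image_url: {
        url: 'https://example.com/cat.png', // placeholder; base64 also allowed
        detail: 'auto' as const,
      },
    },
  ],
};
```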

export interface ChatInterface {
@@ -50,21 +78,10 @@ export interface Folder {
}

export type ModelOptions =
| 'gpt-4o'
| 'gpt-4o-2024-05-13'
| 'gpt-4'
| 'gpt-4-32k'
| 'gpt-4-1106-preview'
| 'gpt-4-0125-preview'
| 'gpt-4-turbo'
| 'gpt-4-turbo-2024-04-09'
| 'gpt-3.5-turbo'
| 'gpt-3.5-turbo-16k'
| 'gpt-3.5-turbo-1106'
| 'gpt-3.5-turbo-0125';
// | 'gpt-3.5-turbo-0301';
// | 'gpt-4-0314'
// | 'gpt-4-32k-0314'
'gpt-3.5-turbo'
| 'gpt-4-vision-preview';

export type ModelType = 'text' | 'image';

export type TotalTokenUsed = {
[model in ModelOptions]?: {
@@ -159,3 +176,8 @@ export interface LocalStorageInterfaceV7oV8
foldersExpanded: boolean[];
folders: FolderCollection;
}

export interface LocalStorageInterfaceV8oV9
extends LocalStorageInterfaceV7oV8 {
}

4 changes: 3 additions & 1 deletion src/utils/import.ts
@@ -5,6 +5,7 @@ import {
ConfigInterface,
FolderCollection,
MessageInterface,
TextContentInterface,
} from '@type/chat';
import { roles } from '@type/chat';
import {
@@ -102,7 +103,8 @@ export const convertOpenAIToBetterChatGPTFormat = (
// Extract message if it exists
if (node.message) {
const { role } = node.message.author;
const content = node.message.content.parts?.join('') || '';
const text = node.message.content.parts?.join('') || '';
const content = [{'type': 'text', 'text': text} as TextContentInterface];
if (content.length > 0) messages.push({ role, content });
}

15 changes: 8 additions & 7 deletions src/utils/messageUtils.ts
@@ -1,4 +1,4 @@
import { MessageInterface, ModelOptions, TotalTokenUsed } from '@type/chat';
import { MessageInterface, ModelOptions, TextContentInterface, TotalTokenUsed } from '@type/chat';

import useStore from '@store/store';

@@ -29,7 +29,7 @@ export const getChatGPTEncoding = (
const serialized = [
messages
.map(({ role, content }) => {
return `<|im_start|>${role}${roleSep}${content}<|im_end|>`;
return `<|im_start|>${role}${roleSep}${(content[0] as TextContentInterface).text}<|im_end|>`;
})
.join(msgSep),
`<|im_start|>assistant${roleSep}`,
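The change above reads `content[0].text` instead of the raw string when serializing for token counting. A standalone sketch of that ChatML-style serialization (separator defaults are assumptions for illustration; the real values come from the encoding config):

```typescript
// Minimal message shape: first content part is assumed to be text,
// matching the cast in getChatGPTEncoding.
interface Msg { role: string; content: { type: 'text'; text: string }[]; }

function serializeForCounting(
  messages: Msg[],
  msgSep = '\n',   // assumed separator
  roleSep = '\n'   // assumed separator
): string {
  return [
    messages
      .map(({ role, content }) =>
        `<|im_start|>${role}${roleSep}${content[0].text}<|im_end|>`)
      .join(msgSep),
    `<|im_start|>assistant${roleSep}`,
  ].join(msgSep);
}
```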
@@ -64,20 +64,21 @@ export const limitMessageTokens = (
}

// Iterate through messages in reverse order, adding them to the limitedMessages array
// until the token limit is reached (excludes first message)
// until the token limit is reached (excludes the first message)
for (let i = messages.length - 1; i >= 1; i--) {
const count = countTokens([messages[i]], model);
if (count + tokenCount > limit) break;
tokenCount += count;
// unshift so that messages remain in their original order
limitedMessages.unshift({ ...messages[i] });
}

// Process first message
// Process the first message:
// If it's a system message and should be retained, always place it at the top.
if (retainSystemMessage) {
// Insert the system message in the third position from the end
limitedMessages.splice(-3, 0, { ...messages[0] });
limitedMessages.unshift({ ...messages[0] });
} else if (!isSystemFirstMessage) {
// Check if the first message (non-system) can fit within the limit
// For non-system messages, check if the first message can fit within the limit
const firstMessageTokenCount = countTokens([messages[0]], model);
if (firstMessageTokenCount + tokenCount < limit) {
limitedMessages.unshift({ ...messages[0] });
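The revised `limitMessageTokens` walks messages newest-first, keeps those that fit under the limit, then unshifts the retained system message so it stays at the top (replacing the old `splice(-3, ...)` placement). A simplified, self-contained sketch of that loop, with token counting stubbed as a word count purely for illustration:

```typescript
// Simplified message shape and a stub token counter (word count).
interface SimpleMsg { role: string; text: string; }

const countTokens = (m: SimpleMsg): number =>
  m.text.split(/\s+/).filter(Boolean).length;

function limitMessages(messages: SimpleMsg[], limit: number): SimpleMsg[] {
  const limited: SimpleMsg[] = [];
  let total = 0;
  const retainSystem = messages[0]?.role === 'system';

  // Newest-first, excluding the first message; unshift keeps original order.
  for (let i = messages.length - 1; i >= 1; i--) {
    const count = countTokens(messages[i]);
    if (count + total > limit) break;
    total += count;
    limited.unshift({ ...messages[i] });
  }

  // A retained system message always goes to the very top.
  if (retainSystem) limited.unshift({ ...messages[0] });
  return limited;
}
```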
1,577 changes: 646 additions & 931 deletions yarn.lock

Large diffs are not rendered by default.