From Reddit. Link: Why prompt is the future? : ChatGPT (reddit.com)
Hey everyone,
Since I built FlowGPT in January (now the largest prompt community), the question I've encountered most frequently is: "Will prompts still be needed in the future?" While opinions vary, I firmly believe prompts are the future. I want to share my perspective with you and would love to hear your thoughts!
What Exactly Is a Prompt?
Instructions
If we were to personify ChatGPT as a highly skilled and knowledgeable employee, then a prompt would be the instruction you give it. The more detailed and context-rich the instruction, and the more clearly the output is defined, the better the result.
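To make this concrete, here is a minimal sketch (with hypothetical helper names, not any real API) of how adding context and an output definition changes what the model actually receives:

```python
def build_prompt(task, context="", output_format=""):
    """Assemble a prompt from a task plus optional context and output spec."""
    parts = [task]
    if context:
        parts.append(f"Context: {context}")
    if output_format:
        parts.append(f"Output format: {output_format}")
    return "\n".join(parts)

# A vague instruction: the model has to guess audience, length, and format.
vague = build_prompt("Summarize this article.")

# A detailed instruction: context and a clear output definition are spelled out.
detailed = build_prompt(
    "Summarize this article for a non-technical reader.",
    context="The article argues that prompts are a new form of content.",
    output_format="Three bullet points, each under 20 words.",
)
```

The two strings would be sent to the same model; only the detailed one pins down what a good answer looks like.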
Content
Due to the strong correlation between prompts and output quality, the market values prompts highly. Some speculators sell prompt bundles through advertisements, while others trade prompts on marketplaces like PromptBase. The most popular approach is to package high-quality prompts with a UI layer, turning them into SaaS products (e.g., Jasper, Copy.ai). Essentially, these companies derive value by renting out prompts, except for those with unique data or large models of their own.
We believe prompts are not merely commodities to be rented or sold. Instead, like short videos and code, they should be shared and discussed.
With the rise of ChatGPT, the market’s awareness of prompts has rapidly increased. Influencers in various niche fields have shared numerous free and useful prompts, causing an explosion in the supply of prompts as more people join in on creating them. Jasper’s users discovered that they could use freely available prompts with ChatGPT, and even if the results weren’t as good as Jasper’s, they still flocked to ChatGPT. The future market is unlikely to pay for prompts because users have too many free options. Low-barrier, highly diverse prompts will become a new form of content, shared, consumed, and iterated upon.
Next-Gen (Natural) Programming Languages
Code vs. Prompt:
What excites us most about prompts is their ability to significantly lower the barrier for users to manipulate data in bulk, turning anyone into a software engineer and empowering them to create solutions for the information age.
Take software as an example. Traditionally, programmers create code while end-users utilize it. Product managers are needed to understand user needs and translate them into product features for developers to implement. This process is riddled with communication-based information loss, making it highly inefficient. The reason this process exists is that end-users lack the time and skills to write code and create software.
The advent of prompts revolutionizes this process and democratizes problem resolution. Now, users only need to describe their problem and relevant steps in natural language to create personalized solutions, eliminating the need for a dedicated team to identify problems and iterate on product-market fit. Users themselves become the best product managers.
Prompts have much lower creation barriers, more powerful capabilities, and a wider range of use cases than code, and will therefore exist in far greater volume.
A Vehicle for Product Imagination
In the past, people needed imagination, specialized skills, and time to create. For example, writing a story or building an app required envisioning content or features and then using writing or programming skills to bring them to life. This could require an individual or a team, with the issue being the long, costly process of developing these skills, which leads to specialization.
However, generative models now enable anyone to possess expert-level skills. To realize any idea, users simply need to describe it with a prompt, allowing for rapid implementation and iteration. Prompts have become vehicles for imagination.
Before GPT, there was little value in capturing imagination, as only final products were used or viewed. Thus, we saw platforms like the App Store, YouTube, and Spotify, which showcased finished products. Now, imagination itself is the product.
The ultimate form of prompt engineering is exploring the use cases for large models. Currently, we face a very realistic problem: people don’t know what AI can be used for. For example, before hearing about HustleGPT (making ChatGPT act as a boss to help you make money), we had no idea that ChatGPT could be used in such a way. Moreover, when we see the various prompts uploaded by users on FlowGPT every day, we are always amazed by their imagination, as we had never thought of such use cases before.
How will prompts develop?
More complex
Prompts will become more complex to accomplish increasingly intricate and specialized tasks. After the update from ChatGPT to GPT-4, we communicated with many community members and made numerous observations. We found that GPT-4 prompts are generally more complex and longer than ChatGPT prompts.
We can draw an analogy with code. Early on, there were no advanced programming languages, and code that can now be written in a single line in Python required dozens of lines. At the same time, computers (the carriers of code) had limited memory and display capabilities, so early software was rudimentary compared to today. However, with the emergence of more advanced programming languages, frameworks, and improvements in hardware, software complexity has increased, enabling the present-day internet software era.
A similar transformation is happening with prompts. GPT-4 has a stronger understanding of natural language, meaning that logic that took ChatGPT several sentences to explain can now be expressed in a single sentence with GPT-4. Prompt engineers can write more accessible, modifiable, and scalable prompts. Also, GPT-4's context window is eight times larger, significantly raising the complexity limit of prompts and rapidly expanding the scale of solvable problems. In the community, ChatGPT prompts were typically used for simple tasks like editing resumes, writing copy, and planning, while GPT-4 prompts include text adventure games, game engines, and prompt-programming frameworks.
More powerful
Prompt capability improvements come from upgrades to the large model itself. For example, once GPT can access the internet, all prompts gain real-time information retrieval and search capabilities, adding another dimension of complexity. The ChatGPT plugin store is similar, allowing prompts to use other SaaS software to store and retrieve data and perform operations. Later, prompts can be further linked to private-domain data and even physical-world capabilities. The range of tasks that prompts can handle is continually expanding, and as the model itself becomes more powerful and versatile, the range of problems prompts can solve will grow rapidly.
More diverse
Prompts are the simplest form of AI-native applications and represent a new type of content.
We believe that there are two ways AIGC can integrate into industries:
Advanced prompt control and capability expansion will lead to various AI-native applications, such as chains of prompts, autonomous agents that generate further prompts based on objectives (AutoGPT), and context-aware chatbots grounded in private-domain data via vector embeddings. Here's an example of a prompt workflow.
In the future, creating an AI+ picture book only requires a title. A ChatGPT prompt will generate a story outline based on that title, and another prompt will create image prompts for Midjourney based on the outline. Midjourney will produce illustrations, followed by a large model that generates environmental descriptions based on the images. Lastly, a multimodal large model will edit, format, and output a complete picture book story as a PDF.
Each module is a unique combination of "Prompt + Large Model," and collaboration between multiple modules forms an AI-native workflow, or flow. It's worth noting that the person who shared this example on Twitter was a software engineer with no illustration or writing experience. In the past, this task would have required at least three people using Word, Photoshop, and other Adobe tools. Now, you only need to know how to write prompts and design flows. Such flows will become more numerous and complex in the future.
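The picture-book flow above can be sketched as a pipeline in which each "Prompt + Large Model" module is a function whose output feeds the next module's prompt. This is a structural sketch only: the real model calls (ChatGPT, Midjourney, a multimodal model) are replaced with stub functions, and all names and data here are made up for illustration.

```python
def generate_outline(title):
    # Module 1: a ChatGPT prompt would turn a title into a story outline.
    return [f"Scene {i}: part of '{title}'" for i in range(1, 4)]

def outline_to_image_prompts(outline):
    # Module 2: another prompt turns each scene into a Midjourney prompt.
    return [f"children's book illustration of {scene}" for scene in outline]

def render_images(image_prompts):
    # Module 3: Midjourney would produce illustrations; stubbed as filenames.
    return [f"illustration_{i}.png" for i, _ in enumerate(image_prompts, 1)]

def assemble_book(title, outline, images):
    # Module 4: a multimodal model would format and export a PDF; stubbed
    # here as a dict pairing each scene with its illustration.
    pages = [f"{scene} [{img}]" for scene, img in zip(outline, images)]
    return {"title": title, "pages": pages}

def picture_book_flow(title):
    # The flow: only the title is supplied; every later input is generated.
    outline = generate_outline(title)
    prompts = outline_to_image_prompts(outline)
    images = render_images(prompts)
    return assemble_book(title, outline, images)

book = picture_book_flow("The Lost Star")
```

The key design point is that no module needs a human in the loop: each one consumes the previous module's output as its prompt, which is what makes the whole flow runnable from a single title.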
If you have any feedback, feel free to drop a comment. If you are interested in my project: we are a creator-centric platform with 20,000+ prompts, and we are hosting a very interesting prompt hackathon (see the pinned post on r/ChatGPT).
This is my website: https://flowgpt.com/
Highlighted comments:
Comment 1 (seems AI-generated, but it makes a lot of sense, haha):
While I appreciate the thoroughness of your post and the potential of prompts in democratizing the creation process, I'm a tad skeptical. While prompts indeed lower the barrier of entry, they may also introduce a layer of abstraction that could lead to oversimplification. (The commenter's point: ChatGPT does lower the barrier, but it may also lead people to think about problems too simplistically.) Just as relying solely on high-level programming languages can limit one's understanding of underlying computational processes, over-reliance on prompts might detach users from the nuances and complexities of the problems they're trying to solve. We should champion prompts as a powerful tool, but ensure that we're not sacrificing depth for accessibility.
Comment 2:
It doesn’t take a rocket scientist to figure out that LLMs will be the go-to for solution gathering, most likely in a prompt-generated format. It’s no secret that people want the answers now instead of searching for them (e.g., search engines).
However, AI+prompt empowers the 'users' to develop a solution and solve local problems easily.
This isn’t actually developing a solution. This is deploying critical thinking via prompts. What’s happening here is that you have programmers creating the platform for, let’s call them, prompt engineers to write out IF AND THEN statements because they have the ability to think critically. Then, you have users resorting to drop-down lists.
The issue with this is, over time, the number of programmers and the number of prompt engineers will likely fade due to a decrease in the population with the ability to think critically. (The commenter's point: people who rely too heavily on ChatGPT for logical thinking may gradually see their own reasoning ability weaken.)
Your issue is you’re using words like solution and prompt to try and convey your accomplishment. That’s dope, but not a solution.
The data collection used from your end-users to further develop your model is what’s needed. I get it. I kind of hope a privacy law shuts all of you down because, honestly, you’re all annoying. At the same time, I don’t, because LLMs have not only improved my livelihood and productivity, but they’re also further increasing the gap between me and the rest of society.
That’s what people don’t realize. Those who know how to fully utilize the tools will only further their own standing. Just like you.
Feel free to leave a message in the comment section below.
- Author: 文雅的疯狂
- Link: https://aiexplorer.rest//why-prompt-is-the-future
- Notice: This article is licensed under CC BY-NC-SA 4.0. Please credit the source when reposting.