Beta/v0.4.0 release (#297)
parent b273f44cac
commit a30bfcf7d0

@@ -1,10 +1,12 @@
 *.db
 *.bak
+*.log
+*.mjs
 _old
 rice-box.go
 .idea/
-/backend
-/backend.exe
+/backend/backend
+/backend/backend.exe
 /frontend/dist
 /frontend/pkg
 /frontend/test-results
CHANGELOG.md
@@ -2,6 +2,19 @@
 
 All notable changes to this project will be documented in this file. For commit guidelines, please refer to [Standard Version](https://github.com/conventional-changelog/standard-version).
 
+## v0.4.0-beta
+
+**New Features**
+- Better logging https://github.com/gtsteffaniak/filebrowser/issues/288
+  - highly configurable
+  - API logs include the user
+- onlyOffice support for editing office files (inspired by https://github.com/filebrowser/filebrowser/pull/2954)
+
+**Notes**
+- Breadcrumbs will only show on file listings (not on previews or editors)
+- The config file is now optional. Without one, the server runs with default settings and logs a `[WARN ]` message.
+- Added more descriptions to the swagger API
+
 ## v0.3.7-beta
 
 **Notes**:
@@ -1,5 +1,6 @@
 FROM gtstef/playwright-base
-WORKDIR /app
-COPY [ "./backend/filebrowser*", "./"]
+WORKDIR /app/frontend
 COPY [ "./frontend/", "./" ]
-RUN ./filebrowser -c filebrowser-playwright.yaml & sleep 2 && npx playwright test
+WORKDIR /app/backend/
+COPY [ "./backend/filebrowser*", "./"]
+RUN ./filebrowser -c filebrowser-playwright.yaml & sleep 2 && cd ../frontend && npx playwright test
@@ -1,5 +1,4 @@
 FROM node:22-slim
-WORKDIR /app
-COPY ./frontend/package.json ./
+WORKDIR /app/frontend
 RUN npm i @playwright/test
 RUN npx playwright install --with-deps firefox
README.md
@@ -98,6 +98,9 @@ See the [API Wiki](https://github.com/gtsteffaniak/filebrowser/wiki/API)
 
 Configuration is done via the `config.yaml`, see the [Configuration Wiki](https://github.com/gtsteffaniak/filebrowser/wiki/Configuration) for available configuration options and other help.
 
+## Office File Support
+
+See [Office Support Wiki](https://github.com/gtsteffaniak/filebrowser/wiki/Office-Support#adding-open-office-integration-for-docker) on how to enable office file editing.
 
 ## Migration from the original filebrowser
 
@@ -115,7 +118,7 @@ Self hostable | ✅ | ✅ | ✅ | ✅ | ❌ | ✅ |
 Has Stable Release? | ❌ | ✅ | ✅ | ✅ | ✅ | ✅ |
 S3 support | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
 webdav support | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
-ftp support | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
+FTP support | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
 Dedicated docs site? | ❌ | ✅ | ✅ | ✅ | ❌ | ✅ |
 Multiple sources at once | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
 Docker image size | 31 MB | 31 MB | 240 MB (main image) | 250 MB | ❌ | > 2 GB |
@@ -142,17 +145,16 @@ Event-based notifications | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
 Metrics | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
 file space quotas | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ |
 text-based files editor | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
-office file support | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |
+office file support | ✅ | ❌ | ✅ | ✅ | ✅ | ✅ |
+Office file previews | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
 Themes | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
 Branding support | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ |
 activity log | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
 Comments support | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
-collaboration on same file | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
 trash support | ❌ | ❌ | ❌ | ✅ | ✅ | ✅ |
 Starred/pinned files | ❌ | ❌ | ❌ | ❌ | ✅ | ✅ |
 Content preview icons | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ |
-Plugins support | ❌ | ❌ | ✅ | ✅ | ❌ | ✅ |
 Chromecast support | ❌ | ❌ | ✅ | ❌ | ❌ | ❌ |
 Share collections of files | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
 Can archive selected files | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
 Can browse archive files | ❌ | ❌ | ❌ | ❌ | ❌ | ✅
@@ -3,13 +3,13 @@ package auth
 import (
 	"encoding/json"
 	"fmt"
-	"log"
 	"net/http"
 	"os"
 	"os/exec"
 	"strings"
 
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/users"
 )
@@ -164,7 +164,7 @@ func (a *HookAuth) SaveUser() (*users.User, error) {
 		return nil, fmt.Errorf("user: failed to mkdir user home dir: [%s]", userHome)
 	}
 	u.Scope = userHome
-	log.Printf("user: %s, home dir: [%s].", u.Username, userHome)
+	logger.Debug(fmt.Sprintf("user: %s, home dir: [%s].", u.Username, userHome))
 
 	err = a.Users.Save(u)
 	if err != nil {
@@ -1,4 +1,4 @@
-package utils
+package cache
 
 import (
 	"sync"
@@ -6,12 +6,13 @@ import (
 )
 
 var (
-	DiskUsageCache     = newCache(30*time.Second, 24*time.Hour)
-	RealPathCache      = newCache(48*time.Hour, 72*time.Hour)
-	SearchResultsCache = newCache(15*time.Second, time.Hour)
+	DiskUsage     = NewCache(30*time.Second, 24*time.Hour)
+	RealPath      = NewCache(48*time.Hour, 72*time.Hour)
+	SearchResults = NewCache(15*time.Second, time.Hour)
+	OnlyOffice    = NewCache(48*time.Hour, 1*time.Hour)
 )
 
-func newCache(expires time.Duration, cleanup time.Duration) *KeyCache {
+func NewCache(expires time.Duration, cleanup time.Duration) *KeyCache {
 	newCache := KeyCache{
 		data:         make(map[string]cachedValue),
 		expiresAfter: expires, // default
@@ -40,6 +41,12 @@ func (c *KeyCache) Set(key string, value interface{}) {
 	}
 }
 
+func (c *KeyCache) Delete(key string) {
+	c.mu.Lock()
+	defer c.mu.Unlock()
+	delete(c.data, key)
+}
+
 func (c *KeyCache) SetWithExp(key string, value interface{}, exp time.Duration) {
 	c.mu.Lock()
 	defer c.mu.Unlock()
@@ -3,7 +3,6 @@ package cmd
 import (
 	"flag"
 	"fmt"
-	"log"
 	"os"
 	"strings"
 
@@ -11,6 +10,7 @@ import (
 	"github.com/gtsteffaniak/filebrowser/backend/files"
 	fbhttp "github.com/gtsteffaniak/filebrowser/backend/http"
 	"github.com/gtsteffaniak/filebrowser/backend/img"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/storage"
 	"github.com/gtsteffaniak/filebrowser/backend/swagger/docs"
@@ -25,21 +25,22 @@ func getStore(config string) (*storage.Storage, bool) {
 	settings.Initialize(config)
 	store, hasDB, err := storage.InitializeDb(settings.Config.Server.Database)
 	if err != nil {
-		log.Fatal("could not load db info: ", err)
+		logger.Fatal(fmt.Sprintf("could not load db info: %v", err))
 	}
 	return store, hasDB
 }
 
 func generalUsage() {
 	fmt.Printf(`usage: ./filebrowser <command> [options]
 commands:
-  -v        Print the version
-  -c        Print the default config file
-  set -u    Username and password for the new user
-  set -a    Create user as admin
-  set -s    Specify a user scope
-  set -h    Print this help message
-` + "\n")
+  -h        Print help
+  -c        Print the default config file
+  version   Print version information
+  set -u    Username and password for the new user
+  set -a    Create user as admin
+  set -s    Specify a user scope
+  set -h    Print this help message
+`)
 }
 
 func StartFilebrowser() {
@@ -90,9 +91,9 @@ func StartFilebrowser() {
 		getStore(dbConfig)
 		// Create the user logic
 		if asAdmin {
-			log.Printf("Creating user as admin: %s\n", username)
+			logger.Info(fmt.Sprintf("Creating user as admin: %s\n", username))
 		} else {
-			log.Printf("Creating user: %s\n", username)
+			logger.Info(fmt.Sprintf("Creating non-admin user: %s\n", username))
 		}
 		newUser := users.User{
 			Username: username,
@@ -103,14 +104,15 @@ func StartFilebrowser() {
 		}
 		err = storage.CreateUser(newUser, asAdmin)
 		if err != nil {
-			log.Fatal("Could not create user: ", err)
+			logger.Fatal(fmt.Sprintf("could not create user: %v", err))
 		}
 		return
 	case "version":
-		fmt.Println("FileBrowser Quantum - A modern web-based file manager")
-		fmt.Printf("Version : %v\n", version.Version)
-		fmt.Printf("Commit : %v\n", version.CommitSHA)
-		fmt.Printf("Release Info : https://github.com/gtsteffaniak/filebrowser/releases/tag/%v\n", version.Version)
+		fmt.Printf(`FileBrowser Quantum - A modern web-based file manager
+Version : %v
+Commit : %v
+Release Info : https://github.com/gtsteffaniak/filebrowser/releases/tag/%v
+`, version.Version, version.CommitSHA, version.Version)
 		return
 	}
 }
@@ -119,15 +121,15 @@ func StartFilebrowser() {
 	if !dbExists {
 		database = fmt.Sprintf("Creating new database : %v", settings.Config.Server.Database)
 	}
-	log.Printf("Initializing FileBrowser Quantum (%v)\n", version.Version)
-	log.Printf("Using Config file : %v", configPath)
-	log.Println("Embeded frontend :", os.Getenv("FILEBROWSER_NO_EMBEDED") != "true")
-	log.Println(database)
 	sources := []string{}
 	for _, v := range settings.Config.Server.Sources {
 		sources = append(sources, v.Name+": "+v.Path)
 	}
-	log.Println("Sources :", sources)
+	logger.Info(fmt.Sprintf("Initializing FileBrowser Quantum (%v)", version.Version))
+	logger.Info(fmt.Sprintf("Using Config file : %v", configPath))
+	logger.Debug(fmt.Sprintf("Embeded frontend : %v", os.Getenv("FILEBROWSER_NO_EMBEDED") != "true"))
+	logger.Info(database)
+	logger.Info(fmt.Sprintf("Sources : %v", sources))
 
 	serverConfig := settings.Config.Server
 	swagInfo := docs.SwaggerInfo
@@ -136,19 +138,19 @@ func StartFilebrowser() {
 	// initialize indexing and schedule indexing ever n minutes (default 5)
 	sourceConfigs := settings.Config.Server.Sources
 	if len(sourceConfigs) == 0 {
-		log.Fatal("No sources configured, exiting...")
+		logger.Fatal("No sources configured, exiting...")
 	}
 	for _, source := range sourceConfigs {
 		go files.Initialize(source)
 	}
 	if err := rootCMD(store, &serverConfig); err != nil {
-		log.Fatal("Error starting filebrowser:", err)
+		logger.Fatal(fmt.Sprintf("Error starting filebrowser: %v", err))
 	}
 }
 
 func rootCMD(store *storage.Storage, serverConfig *settings.Server) error {
 	if serverConfig.NumImageProcessors < 1 {
-		log.Fatal("Image resize workers count could not be < 1")
+		logger.Fatal("Image resize workers count could not be < 1")
 	}
 	imgSvc := img.New(serverConfig.NumImageProcessors)
 
@@ -160,7 +162,7 @@ func rootCMD(store *storage.Storage, serverConfig *settings.Server) error {
 		var err error
 		fileCache, err = diskcache.NewFileCache(cacheDir)
 		if err != nil {
-			log.Fatalf("failed to create file cache: %v", err)
+			logger.Fatal(fmt.Sprintf("failed to create file cache: %v", err))
 		}
 	} else {
 		// No-op cache if no cacheDir is specified
@@ -1,68 +0,0 @@
-package cmd
-
-import (
-	"strconv"
-
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/settings"
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	rulesCmd.AddCommand(rulesRmCommand)
-	rulesRmCommand.Flags().Uint("index", 0, "index of rule to remove")
-	_ = rulesRmCommand.MarkFlagRequired("index")
-}
-
-var rulesRmCommand = &cobra.Command{
-	Use:   "rm <index> [index_end]",
-	Short: "Remove a global rule or user rule",
-	Long: `Remove a global rule or user rule. The provided index
-is the same that's printed when you run 'rules ls'. Note
-that after each removal/addition, the index of the
-commands change. So be careful when removing them after each
-other.
-
-You can also specify an optional parameter (index_end) so
-you can remove all commands from 'index' to 'index_end',
-including 'index_end'.`,
-	Args: func(cmd *cobra.Command, args []string) error {
-		if err := cobra.RangeArgs(1, 2)(cmd, args); err != nil { //nolint:gomnd
-			return err
-		}
-
-		for _, arg := range args {
-			if _, err := strconv.Atoi(arg); err != nil {
-				return err
-			}
-		}
-
-		return nil
-	},
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		i, err := strconv.Atoi(args[0])
-		utils.CheckErr("strconv.Atoi", err)
-		f := i
-		if len(args) == 2 { //nolint:gomnd
-			f, err = strconv.Atoi(args[1])
-			utils.CheckErr("strconv.Atoi", err)
-		}
-
-		user := func(u *users.User) {
-			u.Rules = append(u.Rules[:i], u.Rules[f+1:]...)
-			err := store.Users.Save(u)
-			utils.CheckErr("store.Users.Save", err)
-		}
-
-		global := func(s *settings.Settings) {
-			s.Rules = append(s.Rules[:i], s.Rules[f+1:]...)
-			err := store.Settings.Save(s)
-			utils.CheckErr("store.Settings.Save", err)
-		}
-
-		runRules(store, cmd, user, global)
-	}),
-}
@@ -1,86 +0,0 @@
-package cmd
-
-import (
-	"fmt"
-
-	"github.com/spf13/cobra"
-	"github.com/spf13/pflag"
-
-	"github.com/gtsteffaniak/filebrowser/backend/settings"
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	rulesCmd.PersistentFlags().StringP("username", "u", "", "username of user to which the rules apply")
-	rulesCmd.PersistentFlags().UintP("id", "i", 0, "id of user to which the rules apply")
-}
-
-var rulesCmd = &cobra.Command{
-	Use:   "rules",
-	Short: "Rules management utility",
-	Long: `On each subcommand you'll have available at least two flags:
-"username" and "id". You must either set only one of them
-or none. If you set one of them, the command will apply to
-an user, otherwise it will be applied to the global set or
-rules.`,
-	Args: cobra.NoArgs,
-}
-
-func runRules(st *storage.Storage, cmd *cobra.Command, usersFn func(*users.User), globalFn func(*settings.Settings)) {
-	id := getUserIdentifier(cmd.Flags())
-	if id != nil {
-		user, err := st.Users.Get("", id)
-		utils.CheckErr("st.Users.Get", err)
-
-		if usersFn != nil {
-			usersFn(user)
-		}
-
-		printRules(user.Rules, id)
-		return
-	}
-
-	s, err := st.Settings.Get()
-	utils.CheckErr("st.Settings.Get", err)
-
-	if globalFn != nil {
-		globalFn(s)
-	}
-
-	printRules(s.Rules, id)
-}
-
-func getUserIdentifier(flags *pflag.FlagSet) interface{} {
-	id := mustGetUint(flags, "id")
-	username := mustGetString(flags, "username")
-
-	if id != 0 {
-		return id
-	} else if username != "" {
-		return username
-	}
-
-	return nil
-}
-
-func printRules(rulez []users.Rule, id interface{}) {
-	for id, rule := range rulez {
-		fmt.Printf("(%d) ", id)
-		if rule.Regex {
-			if rule.Allow {
-				fmt.Printf("Allow Regex: \t%s\n", rule.Regexp.Raw)
-			} else {
-				fmt.Printf("Disallow Regex: \t%s\n", rule.Regexp.Raw)
-			}
-		} else {
-			if rule.Allow {
-				fmt.Printf("Allow Path: \t%s\n", rule.Path)
-			} else {
-				fmt.Printf("Disallow Path: \t%s\n", rule.Path)
-			}
-		}
-	}
-}
@@ -1,59 +0,0 @@
-package cmd
-
-import (
-	"regexp"
-
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/settings"
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	rulesCmd.AddCommand(rulesAddCmd)
-	rulesAddCmd.Flags().BoolP("allow", "a", false, "indicates this is an allow rule")
-	rulesAddCmd.Flags().BoolP("regex", "r", false, "indicates this is a regex rule")
-}
-
-var rulesAddCmd = &cobra.Command{
-	Use:   "add <path|expression>",
-	Short: "Add a global rule or user rule",
-	Long:  `Add a global rule or user rule.`,
-	Args:  cobra.ExactArgs(1),
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		allow := mustGetBool(cmd.Flags(), "allow")
-		regex := mustGetBool(cmd.Flags(), "regex")
-		exp := args[0]
-
-		if regex {
-			regexp.MustCompile(exp)
-		}
-
-		rule := users.Rule{
-			Allow: allow,
-			Regex: regex,
-		}
-
-		if regex {
-			rule.Regexp = &users.Regexp{Raw: exp}
-		} else {
-			rule.Path = exp
-		}
-
-		user := func(u *users.User) {
-			u.Rules = append(u.Rules, rule)
-			err := store.Users.Save(u)
-			utils.CheckErr("store.Users.Save", err)
-		}
-
-		global := func(s *settings.Settings) {
-			s.Rules = append(s.Rules, rule)
-			err := store.Settings.Save(s)
-			utils.CheckErr("store.Settings.Save", err)
-		}
-
-		runRules(store, cmd, user, global)
-	}),
-}
@@ -1,20 +0,0 @@
-package cmd
-
-import (
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/spf13/cobra"
-)
-
-func init() {
-	rulesCmd.AddCommand(rulesLsCommand)
-}
-
-var rulesLsCommand = &cobra.Command{
-	Use:   "ls",
-	Short: "List global rules or user specific rules",
-	Long:  `List global rules or user specific rules.`,
-	Args:  cobra.NoArgs,
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		runRules(store, cmd, nil, nil)
-	}),
-}
@@ -1,54 +0,0 @@
-package cmd
-
-import (
-	"fmt"
-	"os"
-	"strconv"
-	"text/tabwriter"
-
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-)
-
-var usersCmd = &cobra.Command{
-	Use:   "users",
-	Short: "Users management utility",
-	Long:  `Users management utility.`,
-	Args:  cobra.NoArgs,
-}
-
-func printUsers(usrs []*users.User) {
-	w := tabwriter.NewWriter(os.Stdout, 0, 0, 2, ' ', 0) //nolint:gomnd
-	fmt.Fprintln(w, "ID\tUsername\tScope\tLocale\tV. Mode\tS.Click\tAdmin\tExecute\tCreate\tRename\tModify\tDelete\tShare\tDownload\tPwd Lock")
-
-	for _, u := range usrs {
-		fmt.Fprintf(w, "%d\t%s\t%s\t%s\t%s\t%t\t%t\t%t\t%t\t%t\t%t\t%t\t%t\t%t\t%t\t\n",
-			u.ID,
-			u.Username,
-			u.Scope,
-			u.Locale,
-			u.ViewMode,
-			u.SingleClick,
-			u.Perm.Admin,
-			u.Perm.Execute,
-			u.Perm.Create,
-			u.Perm.Rename,
-			u.Perm.Modify,
-			u.Perm.Delete,
-			u.Perm.Share,
-			u.Perm.Download,
-			u.LockPassword,
-		)
-	}
-
-	w.Flush()
-}
-
-func parseUsernameOrID(arg string) (username string, id uint) {
-	id64, err := strconv.ParseUint(arg, 10, 64)
-	if err != nil {
-		return arg, 0
-	}
-	return "", uint(id64)
-}
@@ -1,42 +0,0 @@
-package cmd
-
-import (
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	usersCmd.AddCommand(usersAddCmd)
-}
-
-var usersAddCmd = &cobra.Command{
-	Use:   "add <username> <password>",
-	Short: "Create a new user",
-	Long:  `Create a new user and add it to the database.`,
-	Args:  cobra.ExactArgs(2), //nolint:gomnd
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		user := &users.User{
-			Username:     args[0],
-			Password:     args[1],
-			LockPassword: mustGetBool(cmd.Flags(), "lockPassword"),
-		}
-		servSettings, err := store.Settings.GetServer()
-		utils.CheckErr("store.Settings.GetServer()", err)
-		// since getUserDefaults() polluted s.Defaults.Scope
-		// which makes the Scope not the one saved in the db
-		// we need the right s.Defaults.Scope here
-		s2, err := store.Settings.Get()
-		utils.CheckErr("store.Settings.Get()", err)
-
-		userHome, err := s2.MakeUserDir(user.Username, user.Scope, servSettings.Root)
-		utils.CheckErr("s2.MakeUserDir", err)
-		user.Scope = userHome
-
-		err = store.Users.Save(user)
-		utils.CheckErr("store.Users.Save", err)
-		printUsers([]*users.User{user})
-	}),
-}
@@ -1,26 +0,0 @@
-package cmd
-
-import (
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-	"github.com/spf13/cobra"
-)
-
-func init() {
-	usersCmd.AddCommand(usersExportCmd)
-}
-
-var usersExportCmd = &cobra.Command{
-	Use:   "export <path>",
-	Short: "Export all users to a file.",
-	Long: `Export all users to a json or yaml file. Please indicate the
-path to the file where you want to write the users.`,
-	Args: jsonYamlArg,
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		list, err := store.Users.Gets("")
-		utils.CheckErr("store.Users.Gets", err)
-
-		err = marshal(args[0], list)
-		utils.CheckErr("marshal", err)
-	}),
-}
@@ -1,53 +0,0 @@
-package cmd
-
-import (
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	usersCmd.AddCommand(usersFindCmd)
-	usersCmd.AddCommand(usersLsCmd)
-}
-
-var usersFindCmd = &cobra.Command{
-	Use:   "find <id|username>",
-	Short: "Find a user by username or id",
-	Long:  `Find a user by username or id. If no flag is set, all users will be printed.`,
-	Args:  cobra.ExactArgs(1),
-	Run:   findUsers,
-}
-
-var usersLsCmd = &cobra.Command{
-	Use:   "ls",
-	Short: "List all users.",
-	Args:  cobra.NoArgs,
-	Run:   findUsers,
-}
-
-var findUsers = cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-	var (
-		list []*users.User
-		user *users.User
-		err  error
-	)
-
-	if len(args) == 1 {
-		username, id := parseUsernameOrID(args[0])
-		if username != "" {
-			user, err = store.Users.Get("", username)
-		} else {
-			user, err = store.Users.Get("", id)
-		}
-
-		list = []*users.User{user}
-	} else {
-		list, err = store.Users.Gets("")
-	}
-
-	utils.CheckErr("findUsers", err)
-	printUsers(list)
-})
@@ -1,88 +0,0 @@
-package cmd
-
-import (
-	"errors"
-	"fmt"
-	"os"
-	"strconv"
-
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	usersCmd.AddCommand(usersImportCmd)
-	usersImportCmd.Flags().Bool("overwrite", false, "overwrite users with the same id/username combo")
-	usersImportCmd.Flags().Bool("replace", false, "replace the entire user base")
-}
-
-var usersImportCmd = &cobra.Command{
-	Use:   "import <path>",
-	Short: "Import users from a file",
-	Long: `Import users from a file. The path must be for a json or yaml
-file. You can use this command to import new users to your
-installation. For that, just don't place their ID on the files
-list or set it to 0.`,
-	Args: jsonYamlArg,
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		fd, err := os.Open(args[0])
-		utils.CheckErr("os.Open", err)
-		defer fd.Close()
-
-		list := []*users.User{}
-		err = unmarshal(args[0], &list)
-		utils.CheckErr("unmarshal", err)
-
-		if mustGetBool(cmd.Flags(), "replace") {
-			oldUsers, err := store.Users.Gets("")
-			utils.CheckErr("store.Users.Gets", err)
-
-			err = marshal("users.backup.json", list)
-			utils.CheckErr("marshal users.backup.json", err)
-
-			for _, user := range oldUsers {
-				err = store.Users.Delete(user.ID)
-				utils.CheckErr("store.Users.Delete", err)
-			}
-		}
-
-		overwrite := mustGetBool(cmd.Flags(), "overwrite")
-
-		for _, user := range list {
-			onDB, err := store.Users.Get("", user.ID)
-
-			// User exists in DB.
-			if err == nil {
-				if !overwrite {
-					newErr := errors.New("user " + strconv.Itoa(int(user.ID)) + " is already registered")
-					utils.CheckErr("", newErr)
-				}
-
-				// If the usernames mismatch, check if there is another one in the DB
-				// with the new username. If there is, print an error and cancel the
-				// operation
-				if user.Username != onDB.Username {
-					if conflictuous, err := store.Users.Get("", user.Username); err == nil { //nolint:govet
-						newErr := usernameConflictError(user.Username, conflictuous.ID, user.ID)
-						utils.CheckErr("usernameConflictError", newErr)
-					}
-				}
-			} else {
-				// If it doesn't exist, set the ID to 0 to automatically get a new
-				// one that make sense in this DB.
-				user.ID = 0
-			}
-
-			err = store.Users.Save(user)
-			utils.CheckErr("store.Users.Save", err)
-		}
-	}),
-}
-
-func usernameConflictError(username string, originalID, newID uint) error {
-	return fmt.Errorf(`can't import user with ID %d and username "%s" because the username is already registred with the user %d`,
-		newID, username, originalID)
-}
@@ -1,33 +0,0 @@
-package cmd
-
-import (
-	"log"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-	"github.com/spf13/cobra"
-)
-
-func init() {
-	usersCmd.AddCommand(usersRmCmd)
-}
-
-var usersRmCmd = &cobra.Command{
-	Use:   "rm <id|username>",
-	Short: "Delete a user by username or id",
-	Long:  `Delete a user by username or id`,
-	Args:  cobra.ExactArgs(1),
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		username, id := parseUsernameOrID(args[0])
-		var err error
-
-		if username != "" {
-			err = store.Users.Delete(username)
-		} else {
-			err = store.Users.Delete(id)
-		}
-
-		utils.CheckErr("usersRmCmd", err)
-		log.Println("user deleted successfully")
-	}),
-}
@@ -1,40 +0,0 @@
-package cmd
-
-import (
-	"github.com/spf13/cobra"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/users"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func init() {
-	usersCmd.AddCommand(usersUpdateCmd)
-}
-
-var usersUpdateCmd = &cobra.Command{
-	Use:   "update <id|username>",
-	Short: "Updates an existing user",
-	Long: `Updates an existing user. Set the flags for the
-options you want to change.`,
-	Args: cobra.ExactArgs(1),
-	Run: cobraCmd(func(cmd *cobra.Command, args []string, store *storage.Storage) {
-		username, id := parseUsernameOrID(args[0])
-
-		var (
-			err  error
-			user *users.User
-		)
-
-		if id != 0 {
-			user, err = store.Users.Get("", id)
-		} else {
-			user, err = store.Users.Get("", username)
-		}
-		utils.CheckErr("store.Users.Get", err)
-
-		err = store.Users.Update(user)
-		utils.CheckErr("store.Users.Update", err)
-		printUsers([]*users.User{user})
-	}),
-}
@@ -1,88 +0,0 @@
-package cmd
-
-import (
-	"encoding/json"
-	"errors"
-	"os"
-	"path/filepath"
-
-	"github.com/goccy/go-yaml"
-	"github.com/spf13/cobra"
-	"github.com/spf13/pflag"
-
-	"github.com/gtsteffaniak/filebrowser/backend/storage"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
-)
-
-func mustGetString(flags *pflag.FlagSet, flag string) string {
-	s, err := flags.GetString(flag)
-	utils.CheckErr("mustGetString", err)
-	return s
-}
-
-func mustGetBool(flags *pflag.FlagSet, flag string) bool {
-	b, err := flags.GetBool(flag)
-	utils.CheckErr("mustGetBool", err)
-	return b
-}
-
-func mustGetUint(flags *pflag.FlagSet, flag string) uint {
-	b, err := flags.GetUint(flag)
-	utils.CheckErr("mustGetUint", err)
-	return b
-}
-
-type cobraFunc func(cmd *cobra.Command, args []string)
-type pythonFunc func(cmd *cobra.Command, args []string, store *storage.Storage)
-
-func marshal(filename string, data interface{}) error {
-	fd, err := os.Create(filename)
-	utils.CheckErr("os.Create", err)
-	defer fd.Close()
-
-	switch ext := filepath.Ext(filename); ext {
-	case ".json":
-		encoder := json.NewEncoder(fd)
-		encoder.SetIndent("", "    ")
-		return encoder.Encode(data)
-	case ".yml", ".yaml": //nolint:goconst
-		_, err := yaml.Marshal(fd)
-		return err
-	default:
-		return errors.New("invalid format: " + ext)
-	}
-}
-
-func unmarshal(filename string, data interface{}) error {
-	fd, err := os.Open(filename)
-	utils.CheckErr("os.Open", err)
-	defer fd.Close()
-
-	switch ext := filepath.Ext(filename); ext {
-	case ".json":
-		return json.NewDecoder(fd).Decode(data)
-	case ".yml", ".yaml":
-		return yaml.NewDecoder(fd).Decode(data)
-	default:
-		return errors.New("invalid format: " + ext)
-	}
-}
-
-func jsonYamlArg(cmd *cobra.Command, args []string) error {
-	if err := cobra.ExactArgs(1)(cmd, args); err != nil {
-		return err
-	}
-
-	switch ext := filepath.Ext(args[0]); ext {
-	case ".json", ".yml", ".yaml":
-		return nil
-	default:
-		return errors.New("invalid format: " + ext)
-	}
-}
-
-func cobraCmd(fn pythonFunc) cobraFunc {
-	return func(cmd *cobra.Command, args []string) {
-	}
-}
Binary file not shown.
@@ -1,7 +1,7 @@
 server:
   port: 80
   baseURL: "/"
-  root: "./tests/playwright-files"
+  root: "../frontend/tests/playwright-files"
 auth:
   method: password
   signup: false
@@ -2,6 +2,7 @@ package files

 import (
 	"mime"
+	"path/filepath"
 	"regexp"
 	"strconv"
 	"strings"
@@ -52,6 +53,17 @@ var documentTypes = []string{
 	".fb2", // FictionBook
 }

+var onlyOfficeSupported = []string{
+	".doc", ".docm", ".docx", ".dot", ".dotm", ".dotx", ".epub",
+	".fb2", ".fodt", ".htm", ".html", ".mht", ".mhtml", ".odt",
+	".ott", ".rtf", ".stw", ".sxw", ".txt", ".wps", ".wpt", ".xml",
+	".csv", ".et", ".ett", ".fods", ".ods", ".ots", ".sxc", ".xls",
+	".xlsb", ".xlsm", ".xlsx", ".xlt", ".xltm", ".xltx", ".dps",
+	".dpt", ".fodp", ".odp", ".otp", ".pot", ".potm", ".potx",
+	".pps", ".ppsm", ".ppsx", ".ppt", ".pptm", ".pptx", ".sxi",
+	".djvu", ".docxf", ".oform", ".oxps", ".pdf", ".xps",
+}
+
 // Text-based file extensions
 var textTypes = []string{
 	// Common Text Formats
@@ -256,3 +268,13 @@ func isArchive(extension string) bool {
 	}
 	return false
 }
+
+func isOnlyOffice(name string) bool {
+	extention := filepath.Ext(name)
+	for _, typefile := range onlyOfficeSupported {
+		if extention == typefile {
+			return true
+		}
+	}
+	return false
+}
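The new `isOnlyOffice` helper is a linear scan of the filename's extension over the supported-extension list. A standalone sketch of the same check (with a trimmed extension list for brevity; the full list is in the hunk above):

```go
package main

import (
	"fmt"
	"path/filepath"
)

// Trimmed stand-in for the onlyOfficeSupported list in the diff.
var onlyOfficeSupported = []string{".docx", ".xlsx", ".pptx", ".pdf", ".odt"}

// isOnlyOffice reports whether the file's extension is in the supported list.
func isOnlyOffice(name string) bool {
	ext := filepath.Ext(name) // includes the leading dot, e.g. ".docx"
	for _, supported := range onlyOfficeSupported {
		if ext == supported {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isOnlyOffice("report.docx")) // true
	fmt.Println(isOnlyOffice("photo.png"))   // false
}
```

Note that `filepath.Ext` preserves case, so as written the check is case-sensitive: `report.DOCX` would not match.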
@@ -21,9 +21,12 @@ import (
 	"time"
 	"unicode/utf8"

+	"github.com/gtsteffaniak/filebrowser/backend/cache"
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
 	"github.com/gtsteffaniak/filebrowser/backend/fileutils"
+	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/users"
+	"github.com/gtsteffaniak/filebrowser/backend/utils"
 )

 var (
@@ -32,30 +35,32 @@ var (
 )

 type ItemInfo struct {
-	Name    string    `json:"name"`
-	Size    int64     `json:"size"`
-	ModTime time.Time `json:"modified"`
-	Type    string    `json:"type"`
+	Name    string    `json:"name"`     // name of the file
+	Size    int64     `json:"size"`     // length in bytes for regular files
+	ModTime time.Time `json:"modified"` // modification time
+	Type    string    `json:"type"`     // type of the file, either "directory" or a file mimetype
 }

 // FileInfo describes a file.
 // reduced item is non-recursive reduced "Items", used to pass flat items array
 type FileInfo struct {
 	ItemInfo
-	Files   []ItemInfo `json:"files"`
-	Folders []ItemInfo `json:"folders"`
-	Path    string     `json:"path"`
+	Files   []ItemInfo `json:"files"`   // files in the directory
+	Folders []ItemInfo `json:"folders"` // folders in the directory
+	Path    string     `json:"path"`    // path scoped to the associated index
 }

 // for efficiency, a response will be a pointer to the data
 // extra calculated fields can be added here
 type ExtendedFileInfo struct {
 	*FileInfo
-	Content   string            `json:"content,omitempty"`
-	Subtitles []string          `json:"subtitles,omitempty"`
-	Checksums map[string]string `json:"checksums,omitempty"`
-	Token     string            `json:"token,omitempty"`
-	RealPath  string            `json:"-"`
+	Content      string            `json:"content,omitempty"`      // text content of a file, if requested
+	Subtitles    []string          `json:"subtitles,omitempty"`    // subtitles for video files
+	Checksums    map[string]string `json:"checksums,omitempty"`    // checksums for the file
+	Token        string            `json:"token,omitempty"`        // token for the file -- used for sharing
+	OnlyOfficeId string            `json:"onlyOfficeId,omitempty"` // id for onlyoffice files
+	Source       string            `json:"source"`                 // associated index source for the file
+	RealPath     string            `json:"-"`
 }

 // FileOptions are the options when getting a file info.
@@ -134,9 +139,23 @@ func FileInfoFaster(opts FileOptions) (ExtendedFileInfo, error) {
 	}
 	response.FileInfo = info
 	response.RealPath = realPath
+	if settings.Config.Integrations.OnlyOffice.Secret != "" && info.Type != "directory" && isOnlyOffice(info.Name) {
+		response.OnlyOfficeId = generateOfficeId(realPath)
+	}
 	return response, nil
 }

+func generateOfficeId(realPath string) string {
+	key, ok := cache.OnlyOffice.Get(realPath).(string)
+	if !ok {
+		timestamp := strconv.FormatInt(time.Now().UnixMilli(), 10)
+		documentKey := utils.HashSHA256(realPath + timestamp)
+		cache.OnlyOffice.Set(realPath, documentKey)
+		return documentKey
+	}
+	return key
+}
+
 // Checksum checksums a given File for a given User, using a specific
 // algorithm. The checksums data is saved on File object.
 func GetChecksum(fullPath, algo string) (map[string]string, error) {
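`generateOfficeId` caches one document key per real path, so every editor session for the same file presents the same key until the cache entry is replaced. A self-contained sketch of the same get-or-generate pattern, with a plain map standing in for the project's `cache.OnlyOffice` store and a local SHA-256 helper standing in for `utils.HashSHA256` (both stand-ins are assumptions for illustration):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strconv"
	"time"
)

// officeKeys stands in for the project's cache.OnlyOffice store.
var officeKeys = map[string]string{}

func hashSHA256(s string) string {
	sum := sha256.Sum256([]byte(s))
	return hex.EncodeToString(sum[:])
}

// generateOfficeId returns a stable key per path: generated once from
// path+timestamp, then served from the cache on later calls.
func generateOfficeId(realPath string) string {
	if key, ok := officeKeys[realPath]; ok {
		return key
	}
	timestamp := strconv.FormatInt(time.Now().UnixMilli(), 10)
	key := hashSHA256(realPath + timestamp)
	officeKeys[realPath] = key
	return key
}

func main() {
	first := generateOfficeId("/files/report.docx")
	second := generateOfficeId("/files/report.docx")
	fmt.Println(first == second) // true: the key is stable per path
}
```

Mixing a timestamp into the hash means a fresh key is issued whenever the cache entry expires, which forces OnlyOffice to treat the file as a new document version rather than reusing stale co-editing state.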
@@ -2,7 +2,6 @@ package files

 import (
 	"fmt"
-	"log"
 	"os"
 	"path/filepath"
 	"slices"
@@ -10,6 +9,8 @@ import (
 	"sync"
 	"time"

+	"github.com/gtsteffaniak/filebrowser/backend/cache"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/utils"
 )
@@ -52,11 +53,11 @@ func Initialize(source settings.Source) {

 	if !newIndex.Source.Config.Disabled {
 		time.Sleep(time.Second)
-		log.Println("Initializing index and assessing file system complexity")
+		logger.Info("Initializing index and assessing file system complexity")
 		newIndex.RunIndexing("/", false)
 		go newIndex.setupIndexingScanners()
 	} else {
-		log.Println("Indexing disabled for source: ", newIndex.Source.Name)
+		logger.Debug("Indexing disabled for source: " + newIndex.Source.Name)
 	}
 }
@@ -96,7 +97,7 @@ func (idx *Index) indexDirectory(adjustedPath string, quick, recursive bool) err
 		for _, item := range cacheDirItems {
 			err = idx.indexDirectory(combinedPath+item.Name, quick, true)
 			if err != nil {
-				fmt.Printf("error indexing directory %v : %v", combinedPath+item.Name, err)
+				logger.Error(fmt.Sprintf("error indexing directory %v : %v", combinedPath+item.Name, err))
 			}
 		}
 		return nil
@@ -147,7 +148,7 @@ func (idx *Index) indexDirectory(adjustedPath string, quick, recursive bool) err
 			// Recursively index the subdirectory
 			err = idx.indexDirectory(dirPath, quick, recursive)
 			if err != nil {
-				log.Printf("Failed to index directory %s: %v", dirPath, err)
+				logger.Error(fmt.Sprintf("Failed to index directory %s: %v", dirPath, err))
 				continue
 			}
 		}
@@ -226,8 +227,8 @@ func (idx *Index) recursiveUpdateDirSizes(childInfo *FileInfo, previousSize int6
 func (idx *Index) GetRealPath(relativePath ...string) (string, bool, error) {
 	combined := append([]string{idx.Source.Path}, relativePath...)
 	joinedPath := filepath.Join(combined...)
-	isDir, _ := utils.RealPathCache.Get(joinedPath + ":isdir").(bool)
-	cached, ok := utils.RealPathCache.Get(joinedPath).(string)
+	isDir, _ := cache.RealPath.Get(joinedPath + ":isdir").(bool)
+	cached, ok := cache.RealPath.Get(joinedPath).(string)
 	if ok && cached != "" {
 		return cached, isDir, nil
 	}
@@ -239,8 +240,8 @@ func (idx *Index) GetRealPath(relativePath ...string) (string, bool, error) {
 	// Resolve symlinks and get the real path
 	realPath, isDir, err := resolveSymlinks(absolutePath)
 	if err == nil {
-		utils.RealPathCache.Set(joinedPath, realPath)
-		utils.RealPathCache.Set(joinedPath+":isdir", isDir)
+		cache.RealPath.Set(joinedPath, realPath)
+		cache.RealPath.Set(joinedPath+":isdir", isDir)
 	}
 	return realPath, isDir, err
 }
@@ -1,8 +1,10 @@
 package files

 import (
-	"log"
+	"fmt"
 	"time"
+
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )

 // schedule in minutes
@@ -32,7 +34,7 @@ func (idx *Index) newScanner(origin string) {
 	}

 	// Log and sleep before indexing
-	log.Printf("Next scan in %v\n", sleepTime)
+	logger.Debug(fmt.Sprintf("Next scan in %v\n", sleepTime))
 	time.Sleep(sleepTime)

 	idx.scannerMu.Lock()
@@ -74,9 +76,9 @@ func (idx *Index) RunIndexing(origin string, quick bool) {
 	prevNumDirs := idx.NumDirs
 	prevNumFiles := idx.NumFiles
 	if quick {
-		log.Println("Starting quick scan")
+		logger.Debug("Starting quick scan")
 	} else {
-		log.Println("Starting full scan")
+		logger.Debug("Starting full scan")
 		idx.NumDirs = 0
 		idx.NumFiles = 0
 	}
@@ -85,8 +87,9 @@ func (idx *Index) RunIndexing(origin string, quick bool) {
 	// Perform the indexing operation
 	err := idx.indexDirectory("/", quick, true)
 	if err != nil {
-		log.Printf("Error during indexing: %v", err)
+		logger.Error(fmt.Sprintf("Error during indexing: %v", err))
 	}
+	firstRun := idx.LastIndexed == time.Time{}
 	// Update the LastIndexed time
 	idx.LastIndexed = time.Now()
 	idx.indexingTime = int(time.Since(startTime).Seconds())
@@ -102,12 +105,20 @@ func (idx *Index) RunIndexing(origin string, quick bool) {
 		} else {
 			idx.assessment = "normal"
 		}
-		log.Printf("Index assessment : complexity=%v directories=%v files=%v \n", idx.assessment, idx.NumDirs, idx.NumFiles)
+		if firstRun {
+			logger.Info(fmt.Sprintf("Index assessment : complexity=%v directories=%v files=%v", idx.assessment, idx.NumDirs, idx.NumFiles))
+		} else {
+			logger.Debug(fmt.Sprintf("Index assessment : complexity=%v directories=%v files=%v", idx.assessment, idx.NumDirs, idx.NumFiles))
+		}
 		if idx.NumDirs != prevNumDirs || idx.NumFiles != prevNumFiles {
 			idx.FilesChangedDuringIndexing = true
 		}
 	}
-	log.Printf("Time Spent Indexing : %v seconds\n", idx.indexingTime)
+	if firstRun {
+		logger.Info(fmt.Sprintf("Time spent indexing : %v seconds", idx.indexingTime))
+	} else {
+		logger.Debug(fmt.Sprintf("Time spent indexing : %v seconds", idx.indexingTime))
+	}
 }

 func (idx *Index) setupIndexingScanners() {
@@ -6,6 +6,7 @@ import (
 	"strings"
 	"sync"

+	"github.com/gtsteffaniak/filebrowser/backend/cache"
 	"github.com/gtsteffaniak/filebrowser/backend/utils"
 )

@@ -23,18 +24,18 @@ type SearchResult struct {
 func (idx *Index) Search(search string, scope string, sourceSession string) []SearchResult {
 	// Remove slashes
 	scope = idx.makeIndexPath(scope)
-	runningHash := utils.GenerateRandomHash(4)
+	runningHash := utils.InsecureRandomIdentifier(4)
 	sessionInProgress.Store(sourceSession, runningHash) // Store the value in the sync.Map
 	searchOptions := ParseSearch(search)
 	results := make(map[string]SearchResult, 0)
 	count := 0
 	var directories []string
-	cachedDirs, ok := utils.SearchResultsCache.Get(idx.Source.Path + scope).([]string)
+	cachedDirs, ok := cache.SearchResults.Get(idx.Source.Path + scope).([]string)
 	if ok {
 		directories = cachedDirs
 	} else {
 		directories = idx.getDirsInScope(scope)
-		utils.SearchResultsCache.Set(idx.Source.Path+scope, directories)
+		cache.SearchResults.Set(idx.Source.Path+scope, directories)
 	}
 	for _, searchTerm := range searchOptions.Terms {
 		if searchTerm == "" {
@@ -7,13 +7,11 @@ require (
 	github.com/disintegration/imaging v1.6.2
 	github.com/dsoprea/go-exif/v3 v3.0.1
 	github.com/flynn/go-shlex v0.0.0-20150515145356-3f9db97f8568
-	github.com/goccy/go-yaml v1.15.13
+	github.com/goccy/go-yaml v1.15.15
 	github.com/golang-jwt/jwt/v4 v4.5.1
 	github.com/google/go-cmp v0.6.0
 	github.com/shirou/gopsutil/v3 v3.24.5
 	github.com/spf13/afero v1.11.0
-	github.com/spf13/cobra v1.8.1
-	github.com/spf13/pflag v1.0.5
 	github.com/stretchr/testify v1.9.0
 	github.com/swaggo/http-swagger v1.3.4
 	github.com/swaggo/swag v1.16.4
@@ -35,7 +33,6 @@ require (
 	github.com/go-openapi/swag v0.23.0 // indirect
 	github.com/golang/geo v0.0.0-20230421003525-6adc56603217 // indirect
 	github.com/golang/snappy v0.0.4 // indirect
-	github.com/inconshreveable/mousetrap v1.1.0 // indirect
 	github.com/josharian/intern v1.0.0 // indirect
 	github.com/mailru/easyjson v0.9.0 // indirect
 	github.com/pmezard/go-difflib v1.0.0 // indirect
@@ -6,7 +6,6 @@ github.com/Sereal/Sereal v0.0.0-20190618215532-0b8ac451a863 h1:BRrxwOZBolJN4gIwv
 github.com/Sereal/Sereal v0.0.0-20190618215532-0b8ac451a863/go.mod h1:D0JMgToj/WdxCgd30Kc1UcA9E+WdZoJqeVOuYW7iTBM=
 github.com/asdine/storm/v3 v3.2.1 h1:I5AqhkPK6nBZ/qJXySdI7ot5BlXSZ7qvDY1zAn5ZJac=
 github.com/asdine/storm/v3 v3.2.1/go.mod h1:LEpXwGt4pIqrE/XcTvCnZHT5MgZCV6Ub9q7yQzOFWr0=
-github.com/cpuguy83/go-md2man/v2 v2.0.4/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
 github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
 github.com/disintegration/imaging v1.6.2 h1:w1LecBlG2Lnp8B3jk5zSuNqd7b4DXhcjwek1ei82L+c=
@@ -47,8 +46,8 @@ github.com/go-openapi/spec v0.21.0 h1:LTVzPc3p/RzRnkQqLRndbAzjY0d0BCL72A6j3CdL9Z
 github.com/go-openapi/spec v0.21.0/go.mod h1:78u6VdPw81XU44qEWGhtr982gJ5BWg2c0I5XwVMotYk=
 github.com/go-openapi/swag v0.23.0 h1:vsEVJDUo2hPJ2tu0/Xc+4noaxyEffXNIs3cOULZ+GrE=
 github.com/go-openapi/swag v0.23.0/go.mod h1:esZ8ITTYEsH1V2trKHjAN8Ai7xHb8RV+YSZ577vPjgQ=
-github.com/goccy/go-yaml v1.15.13 h1:Xd87Yddmr2rC1SLLTm2MNDcTjeO/GYo0JGiww6gSTDg=
-github.com/goccy/go-yaml v1.15.13/go.mod h1:XBurs7gK8ATbW4ZPGKgcbrY1Br56PdM69F7LkFRi1kA=
+github.com/goccy/go-yaml v1.15.15 h1:5turdzAlutS2Q7/QR/9R99Z1K0J00qDb4T0pHJcZ5ew=
+github.com/goccy/go-yaml v1.15.15/go.mod h1:XBurs7gK8ATbW4ZPGKgcbrY1Br56PdM69F7LkFRi1kA=
 github.com/golang-jwt/jwt/v4 v4.5.1 h1:JdqV9zKUdtaa9gdPlywC3aeoEsR681PlKC+4F5gQgeo=
 github.com/golang-jwt/jwt/v4 v4.5.1/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=
 github.com/golang/geo v0.0.0-20190916061304-5b978397cfec/go.mod h1:QZ0nwyI2jOfgRAoBvP+ab5aRr7c9x7lhGEJrKvBwjWI=
@@ -65,8 +64,6 @@ github.com/golang/snappy v0.0.4 h1:yAGX7huGHXlcLOEtBnF4w7FQwA26wojNCwOYAEhLjQM=
 github.com/golang/snappy v0.0.4/go.mod h1:/XxbfmMg8lxefKM7IXC3fBNl/7bRcc72aCRzEWrmP2Q=
 github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
 github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
-github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
-github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
 github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
 github.com/jessevdk/go-flags v1.5.0/go.mod h1:Fw0T6WPc1dYxT4mKEZRfG5kJhaTDP9pj1c2EWnYs/m4=
 github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8HmY=
@@ -86,15 +83,10 @@ github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55 h1:o4JXh1EVt
 github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55/go.mod h1:OmDBASR4679mdNQnz2pUhc2G8CO2JrUAVFDRBDP/hJE=
 github.com/rogpeppe/go-internal v1.11.0 h1:cWPaGQEPrBb5/AsnsZesgZZ9yb1OQ+GOISoDNXVBh4M=
 github.com/rogpeppe/go-internal v1.11.0/go.mod h1:ddIwULY96R17DhadqLgMfk9H9tvdUzkipdSkR5nkCZA=
-github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
 github.com/shirou/gopsutil/v3 v3.24.5 h1:i0t8kL+kQTvpAYToeuiVk3TgDeKOFioZO3Ztz/iZ9pI=
 github.com/shirou/gopsutil/v3 v3.24.5/go.mod h1:bsoOS1aStSs9ErQ1WWfxllSeS1K5D+U30r2NfcubMVk=
 github.com/spf13/afero v1.11.0 h1:WJQKhtpdm3v2IzqG8VMqrr6Rf3UYpEF239Jy9wNepM8=
 github.com/spf13/afero v1.11.0/go.mod h1:GH9Y3pIexgf1MTIWtNGyogA5MwRIDXGUr+hbWNoBjkY=
-github.com/spf13/cobra v1.8.1 h1:e5/vxKd/rZsfSJMUX1agtjeTDf+qv1/JdBF8gg5k9ZM=
-github.com/spf13/cobra v1.8.1/go.mod h1:wHxEcudfqmLYa8iTfL+OuZPbBZkmvliBWKIezN3kD9Y=
-github.com/spf13/pflag v1.0.5 h1:iy+VFUOCP1a+8yFto/drg2CJ5u0yRoB7fZw3DKv/JXA=
-github.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
 github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
 github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
 github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
@@ -4,7 +4,6 @@ import (
 	"encoding/json"
 	libError "errors"
 	"fmt"
-	"log"
 	"net/http"
 	"net/url"
 	"os"
@@ -18,6 +17,7 @@ import (
 
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
 	"github.com/gtsteffaniak/filebrowser/backend/files"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/share"
 	"github.com/gtsteffaniak/filebrowser/backend/users"
@@ -46,9 +46,18 @@ func extractToken(r *http.Request) (string, error) {
 		}
 	}
 
+	auth := r.URL.Query().Get("auth")
+	if auth != "" {
+		hasToken = true
+		if strings.Count(auth, ".") == 2 {
+			return auth, nil
+		}
+	}
+
 	// Check for Authorization header
 	authHeader := r.Header.Get("Authorization")
 	if authHeader != "" {
 		hasToken = true
 		// Split the header to get "Bearer {token}"
 		parts := strings.Split(authHeader, " ")
@@ -58,14 +67,6 @@ func extractToken(r *http.Request) (string, error) {
 		}
 	}
 
-	auth := r.URL.Query().Get("auth")
-	if auth != "" {
-		hasToken = true
-		if strings.Count(auth, ".") == 2 {
-			return auth, nil
-		}
-	}
-
 	if hasToken {
 		return "", fmt.Errorf("invalid token provided")
 	}
@@ -129,12 +130,12 @@ func signupHandler(w http.ResponseWriter, r *http.Request) {
 
 	userHome, err := config.MakeUserDir(user.Username, user.Scope, files.RootPaths["default"])
 	if err != nil {
-		log.Printf("create user: failed to mkdir user home dir: [%s]", userHome)
+		logger.Error(fmt.Sprintf("create user: failed to mkdir user home dir: [%s]", userHome))
 		http.Error(w, http.StatusText(http.StatusInternalServerError), http.StatusInternalServerError)
 		return
 	}
 	user.Scope = userHome
-	log.Printf("new user: %s, home dir: [%s].", user.Username, userHome)
+	logger.Debug(fmt.Sprintf("new user: %s, home dir: [%s].", user.Username, userHome))
 	err = store.Users.Save(&user)
 	if err == errors.ErrExist {
 		http.Error(w, http.StatusText(http.StatusConflict), http.StatusConflict)
@@ -151,7 +152,7 @@ func renewHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (in
 	}
 
 func printToken(w http.ResponseWriter, _ *http.Request, user *users.User) (int, error) {
-	signed, err := makeSignedTokenAPI(user, "WEB_TOKEN_"+utils.GenerateRandomHash(4), time.Hour*2, user.Perm)
+	signed, err := makeSignedTokenAPI(user, "WEB_TOKEN_"+utils.InsecureRandomIdentifier(4), time.Hour*2, user.Perm)
 	if err != nil {
 		if strings.Contains(err.Error(), "key already exists with same name") {
 			return http.StatusConflict, err
@@ -4,7 +4,6 @@ import (
 	"compress/gzip"
 	"encoding/json"
 	"fmt"
-	"log"
 	"net/http"
 	"path/filepath"
 	"strings"
@@ -12,15 +11,16 @@ import (
 
 	"github.com/golang-jwt/jwt/v4"
 	"github.com/gtsteffaniak/filebrowser/backend/files"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/runner"
-	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/users"
 )
 
 type requestContext struct {
 	user *users.User
 	*runner.Runner
 	raw interface{}
+	token string
 }
 
 type HttpResponse struct {
@@ -94,7 +94,7 @@ func withAdminHelper(fn handleFunc) handleFunc {
 // Middleware to retrieve and authenticate user
 func withUserHelper(fn handleFunc) handleFunc {
 	return func(w http.ResponseWriter, r *http.Request, data *requestContext) (int, error) {
-		if settings.Config.Auth.Method == "noauth" {
+		if config.Auth.Method == "noauth" {
 			var err error
 			// Retrieve the user from the store and store it in the context
 			data.user, err = store.Users.Get(files.RootPaths["default"], "admin")
@@ -108,8 +108,10 @@ func withUserHelper(fn handleFunc) handleFunc {
 		}
 		tokenString, err := extractToken(r)
 		if err != nil {
+			logger.Debug(fmt.Sprintf("error extracting from request %v", err))
 			return http.StatusUnauthorized, err
 		}
+		data.token = tokenString
 
 		var tk users.AuthToken
 		token, err := jwt.ParseWithClaims(tokenString, &tk, keyFunc)
@@ -126,11 +128,15 @@ func withUserHelper(fn handleFunc) handleFunc {
 		if tk.Expires < time.Now().Add(time.Hour).Unix() {
 			w.Header().Add("X-Renew-Token", "true")
 		}
 
 		// Retrieve the user from the store and store it in the context
 		data.user, err = store.Users.Get(files.RootPaths["default"], tk.BelongsTo)
 		if err != nil {
 			return http.StatusInternalServerError, err
 		}
 
+		setUserInResponseWriter(w, data.user)
+
 		// Call the handler function, passing in the context
 		return fn(w, r, data)
 	}
@@ -172,14 +178,14 @@ func wrapHandler(fn handleFunc) http.HandlerFunc {
 			// Marshal the error response to JSON
 			errorBytes, marshalErr := json.Marshal(response)
 			if marshalErr != nil {
-				log.Printf("Error marshalling error response: %v", marshalErr)
+				logger.Error(fmt.Sprintf("Error marshalling error response: %v", marshalErr))
 				http.Error(w, "Internal Server Error", http.StatusInternalServerError)
 				return
 			}
 
 			// Write the JSON error response
 			if _, writeErr := w.Write(errorBytes); writeErr != nil {
-				log.Printf("Error writing error response: %v", writeErr)
+				logger.Error(fmt.Sprintf("Error writing error response: %v", writeErr))
 			}
 			return
 		}
@@ -233,6 +239,7 @@ type ResponseWriterWrapper struct {
 	StatusCode  int
 	wroteHeader bool
 	PayloadSize int
+	User        string
 }
 
 // WriteHeader captures the status code and ensures it's only written once
@@ -255,6 +262,16 @@ func (w *ResponseWriterWrapper) Write(b []byte) (int, error) {
 	return w.ResponseWriter.Write(b)
 }
 
+// Helper function to set the user in the ResponseWriterWrapper
+func setUserInResponseWriter(w http.ResponseWriter, user *users.User) {
+	// Wrap the response writer to set the user field
+	if wrappedWriter, ok := w.(*ResponseWriterWrapper); ok {
+		if user != nil {
+			wrappedWriter.User = user.Username
+		}
+	}
+}
+
 // LoggingMiddleware logs each request and its status code.
 func LoggingMiddleware(next http.Handler) http.Handler {
 	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
@@ -267,30 +284,23 @@ func LoggingMiddleware(next http.Handler) http.Handler {
 		// Call the next handler.
 		next.ServeHTTP(wrappedWriter, r)
 
-		// Determine the color based on the status code.
-		color := "\033[32m" // Default green color
-		if wrappedWriter.StatusCode >= 300 && wrappedWriter.StatusCode < 500 {
-			color = "\033[33m" // Yellow for client errors (4xx)
-		} else if wrappedWriter.StatusCode >= 500 {
-			color = "\033[31m" // Red for server errors (5xx)
-		}
-
 		// Capture the full URL path including the query parameters.
 		fullURL := r.URL.Path
 		if r.URL.RawQuery != "" {
 			fullURL += "?" + r.URL.RawQuery
 		}
-		// Log the request, status code, and response size.
-		log.Printf("%s%-7s | %3d | %-15s | %-12s | \"%s\"%s",
-			color,
-			r.Method,
-			wrappedWriter.StatusCode, // Captured status code
-			r.RemoteAddr,
-			time.Since(start).String(),
-			fullURL,
-			"\033[0m", // Reset color
-		)
+		truncUser := wrappedWriter.User
+		if len(truncUser) > 12 {
+			truncUser = truncUser[:10] + ".."
+		}
+		logger.Api(
+			fmt.Sprintf("%-7s | %3d | %-15s | %-12s | %-12s | \"%s\"",
+				r.Method,
+				wrappedWriter.StatusCode, // Captured status code
+				r.RemoteAddr,
+				truncUser,
+				time.Since(start).String(),
+				fullURL), wrappedWriter.StatusCode)
 	})
 }
@@ -105,7 +105,7 @@ func TestWithAdminHelper(t *testing.T) {
 			data := &requestContext{
 				user: tc.user,
 			}
-			token, err := makeSignedTokenAPI(tc.user, "WEB_TOKEN_"+utils.GenerateRandomHash(4), time.Hour*2, tc.user.Perm)
+			token, err := makeSignedTokenAPI(tc.user, "WEB_TOKEN_"+utils.InsecureRandomIdentifier(4), time.Hour*2, tc.user.Perm)
 			if err != nil {
 				t.Fatalf("Error making token for request: %v", err)
 			}
@@ -0,0 +1,209 @@
+package http
+
+import (
+	"encoding/json"
+	"errors"
+	"fmt"
+	"io"
+	"net/http"
+	"net/url"
+	"path/filepath"
+	"strconv"
+	"strings"
+
+	"github.com/golang-jwt/jwt/v4"
+	"github.com/gtsteffaniak/filebrowser/backend/cache"
+	"github.com/gtsteffaniak/filebrowser/backend/files"
+	"github.com/gtsteffaniak/filebrowser/backend/settings"
+)
+
+const (
+	onlyOfficeStatusDocumentClosedWithChanges       = 2
+	onlyOfficeStatusDocumentClosedWithNoChanges     = 4
+	onlyOfficeStatusForceSaveWhileDocumentStillOpen = 6
+)
+
+type OnlyOfficeCallback struct {
+	ChangesURL string   `json:"changesurl,omitempty"`
+	Key        string   `json:"key,omitempty"`
+	Status     int      `json:"status,omitempty"`
+	URL        string   `json:"url,omitempty"`
+	Users      []string `json:"users,omitempty"`
+	UserData   string   `json:"userdata,omitempty"`
+}
+
+func onlyofficeClientConfigGetHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int, error) {
+	if settings.Config.Integrations.OnlyOffice.Url == "" {
+		return http.StatusInternalServerError, errors.New("only-office integration must be configured in settings")
+	}
+
+	if !d.user.Perm.Modify {
+		return http.StatusForbidden, nil
+	}
+	encodedUrl := r.URL.Query().Get("url")
+	source := r.URL.Query().Get("source")
+	if source == "" {
+		source = "default"
+	}
+	// Decode the URL-encoded path
+	url, err := url.QueryUnescape(encodedUrl)
+	if err != nil {
+		return http.StatusBadRequest, fmt.Errorf("invalid path encoding: %v", err)
+	}
+
+	// get path from url
+	pathParts := strings.Split(url, "/api/raw?files=/")
+	path := pathParts[len(pathParts)-1]
+	urlFirst := pathParts[0]
+	if settings.Config.Server.InternalUrl != "" {
+		urlFirst = settings.Config.Server.InternalUrl
+		replacement := strings.Split(url, "/api/raw")[0]
+		url = strings.Replace(url, replacement, settings.Config.Server.InternalUrl, 1)
+	}
+	fileInfo, err := files.FileInfoFaster(files.FileOptions{
+		Path:       filepath.Join(d.user.Scope, path),
+		Modify:     d.user.Perm.Modify,
+		Source:     source,
+		Expand:     false,
+		ReadHeader: config.Server.TypeDetectionByHeader,
+		Checker:    d.user,
+	})
+
+	if err != nil {
+		return errToStatus(err), err
+	}
+
+	id, err := getOnlyOfficeId(source, fileInfo.Path)
+	if err != nil {
+		return http.StatusNotFound, err
+	}
+	split := strings.Split(fileInfo.Name, ".")
+	fileType := split[len(split)-1]
+
+	theme := "light"
+	if d.user.DarkMode {
+		theme = "dark"
+	}
+
+	clientConfig := map[string]interface{}{
+		"document": map[string]interface{}{
+			"fileType": fileType,
+			"key":      id,
+			"title":    fileInfo.Name,
+			"url":      url + "&auth=" + d.token,
+			"permissions": map[string]interface{}{
+				"edit":     d.user.Perm.Modify,
+				"download": d.user.Perm.Download,
+				"print":    d.user.Perm.Download,
+			},
+		},
+		"editorConfig": map[string]interface{}{
+			"callbackUrl": fmt.Sprintf("%v/api/onlyoffice/callback?path=%v&auth=%v", urlFirst, path, d.token),
+			"user": map[string]interface{}{
+				"id":   strconv.FormatUint(uint64(d.user.ID), 10),
+				"name": d.user.Username,
+			},
+			"customization": map[string]interface{}{
+				"autosave":  true,
+				"forcesave": true,
+				"uiTheme":   theme,
+			},
+			"lang": d.user.Locale,
+			"mode": "edit",
+		},
+	}
+	if settings.Config.Integrations.OnlyOffice.Secret != "" {
+		token := jwt.NewWithClaims(jwt.SigningMethodHS256, jwt.MapClaims(clientConfig))
+		signature, err := token.SignedString([]byte(settings.Config.Integrations.OnlyOffice.Secret))
+		if err != nil {
+			return http.StatusInternalServerError, fmt.Errorf("failed to sign JWT")
+		}
+		clientConfig["token"] = signature
+	}
+	return renderJSON(w, r, clientConfig)
+}
+
+func onlyofficeCallbackHandler(w http.ResponseWriter, r *http.Request, d *requestContext) (int, error) {
+	body, err := io.ReadAll(r.Body)
+	if err != nil {
+		return http.StatusInternalServerError, err
+	}
+
+	var data OnlyOfficeCallback
+	err = json.Unmarshal(body, &data)
+	if err != nil {
+		return http.StatusInternalServerError, err
+	}
+
+	encodedPath := r.URL.Query().Get("path")
+	source := r.URL.Query().Get("source")
+	if source == "" {
+		source = "default"
+	}
+	// Decode the URL-encoded path
+	path, err := url.QueryUnescape(encodedPath)
+	if err != nil {
+		return http.StatusBadRequest, fmt.Errorf("invalid path encoding: %v", err)
+	}
+	if data.Status == onlyOfficeStatusDocumentClosedWithChanges ||
+		data.Status == onlyOfficeStatusDocumentClosedWithNoChanges {
+		// Refer to only-office documentation
+		// - https://api.onlyoffice.com/editors/coedit
+		// - https://api.onlyoffice.com/editors/callback
+		//
+		// When the document is fully closed by all editors,
+		// then the document key should no longer be re-used.
+		deleteOfficeId(source, path)
+	}
+
+	if data.Status == onlyOfficeStatusDocumentClosedWithChanges ||
+		data.Status == onlyOfficeStatusForceSaveWhileDocumentStillOpen {
+		if !d.user.Perm.Modify {
+			return http.StatusForbidden, nil
+		}
+
+		doc, err := http.Get(data.URL)
+		if err != nil {
+			return http.StatusInternalServerError, err
+		}
+		defer doc.Body.Close()
+
+		err = d.Runner.RunHook(func() error {
+			fileOpts := files.FileOptions{
+				Path:   path,
+				Source: source,
+			}
+			writeErr := files.WriteFile(fileOpts, doc.Body)
+			if writeErr != nil {
+				return writeErr
+			}
+			return nil
+		}, "save", path, "", d.user)
+
+		if err != nil {
+			return http.StatusInternalServerError, err
+		}
+	}
+
+	resp := map[string]int{
+		"error": 0,
+	}
+	return renderJSON(w, r, resp)
+}
+
+func getOnlyOfficeId(source, path string) (string, error) {
+	idx := files.GetIndex(source)
+	realpath, _, _ := idx.GetRealPath(path)
+	// error is intentionally ignored in order treat errors
+	// the same as a cache-miss
+	cachedDocumentKey, ok := cache.OnlyOffice.Get(realpath).(string)
+	if ok {
+		return cachedDocumentKey, nil
+	}
+	return "", fmt.Errorf("document key not found")
+}
+func deleteOfficeId(source, path string) {
+	idx := files.GetIndex(source)
+	realpath, _, _ := idx.GetRealPath(path)
+	cache.OnlyOffice.Delete(realpath)
+}
@@ -13,6 +13,7 @@ import (
 
 	"github.com/gtsteffaniak/filebrowser/backend/files"
 	"github.com/gtsteffaniak/filebrowser/backend/img"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )
 
 type ImgService interface {
@@ -141,7 +142,7 @@ func createPreview(imgSvc ImgService, fileCache FileCache, file files.ExtendedFi
 	go func() {
 		cacheKey := previewCacheKey(file.RealPath, previewSize, file.FileInfo.ModTime)
 		if err := fileCache.Store(context.Background(), cacheKey, buf.Bytes()); err != nil {
-			fmt.Printf("failed to cache resized image: %v", err)
+			logger.Error(fmt.Sprintf("failed to cache resized image: %v", err))
 		}
 	}()
@@ -7,7 +7,6 @@ import (
 	"errors"
 	"fmt"
 	"io"
-	"log"
 	"net/http"
 	"net/url"
 	"os"
@@ -15,6 +14,7 @@ import (
 	"strings"
 
 	"github.com/gtsteffaniak/filebrowser/backend/files"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )
 
 func setContentDisposition(w http.ResponseWriter, r *http.Request, fileName string) {
@@ -211,13 +211,14 @@ func rawFilesHandler(w http.ResponseWriter, r *http.Request, d *requestContext,
 	default:
 		return http.StatusInternalServerError, errors.New("format not implemented")
 	}
 
 	baseDirName := filepath.Base(filepath.Dir(realPath))
 	if baseDirName == "" || baseDirName == "/" {
 		baseDirName = "download"
 	}
+	if len(fileList) == 1 && isDir {
+		baseDirName = filepath.Base(realPath)
+	}
 	downloadFileName := url.PathEscape(baseDirName + extension)
 
 	w.Header().Set("Content-Disposition", "attachment; filename*=utf-8''"+downloadFileName)
 	// Create the archive and stream it directly to the response
 	if extension == ".zip" {
@@ -242,7 +243,7 @@ func createZip(w io.Writer, d *requestContext, filenames ...string) error {
 	for _, fname := range filenames {
 		err := addFile(fname, d, nil, zipWriter, false)
 		if err != nil {
-			log.Printf("Failed to add %s to ZIP: %v", fname, err)
+			logger.Error(fmt.Sprintf("Failed to add %s to ZIP: %v", fname, err))
 		}
 	}
 
@@ -261,7 +262,7 @@ func createTarGz(w io.Writer, d *requestContext, filenames ...string) error {
 	for _, fname := range filenames {
 		err := addFile(fname, d, tarWriter, nil, false)
 		if err != nil {
-			log.Printf("Failed to add %s to TAR.GZ: %v", fname, err)
+			logger.Error(fmt.Sprintf("Failed to add %s to TAR.GZ: %v", fname, err))
 		}
 	}
@@ -13,9 +13,9 @@ import (
 
 	"github.com/shirou/gopsutil/v3/disk"
 
+	"github.com/gtsteffaniak/filebrowser/backend/cache"
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
 	"github.com/gtsteffaniak/filebrowser/backend/files"
-	"github.com/gtsteffaniak/filebrowser/backend/utils"
 )
 
 // resourceGetHandler retrieves information about a resource.
@@ -397,7 +397,7 @@ func diskUsage(w http.ResponseWriter, r *http.Request, d *requestContext) (int,
 	if source == "" {
 		source = "default"
 	}
-	value, ok := utils.DiskUsageCache.Get(source).(DiskUsageResponse)
+	value, ok := cache.DiskUsage.Get(source).(DiskUsageResponse)
 	if ok {
 		return renderJSON(w, r, &value)
 	}
@@ -415,7 +415,7 @@ func diskUsage(w http.ResponseWriter, r *http.Request, d *requestContext) (int,
 		Total: usage.Total,
 		Used:  usage.Used,
 	}
-	utils.DiskUsageCache.Set(source, latestUsage)
+	cache.DiskUsage.Set(source, latestUsage)
 	return renderJSON(w, r, &latestUsage)
 }
|
@ -5,11 +5,11 @@ import (
|
||||||
"embed"
|
"embed"
|
||||||
"fmt"
|
"fmt"
|
||||||
"io/fs"
|
"io/fs"
|
||||||
"log"
|
|
||||||
"net/http"
|
"net/http"
|
||||||
"os"
|
"os"
|
||||||
"text/template"
|
"text/template"
|
||||||
|
|
||||||
|
"github.com/gtsteffaniak/filebrowser/backend/logger"
|
||||||
"github.com/gtsteffaniak/filebrowser/backend/settings"
|
"github.com/gtsteffaniak/filebrowser/backend/settings"
|
||||||
"github.com/gtsteffaniak/filebrowser/backend/storage"
|
"github.com/gtsteffaniak/filebrowser/backend/storage"
|
||||||
"github.com/gtsteffaniak/filebrowser/backend/version"
|
"github.com/gtsteffaniak/filebrowser/backend/version"
|
||||||
|
@ -56,7 +56,7 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
|
||||||
// Embedded mode: Serve files from the embedded assets
|
// Embedded mode: Serve files from the embedded assets
|
||||||
assetFs, err = fs.Sub(assets, "embed")
|
assetFs, err = fs.Sub(assets, "embed")
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatal("Could not embed frontend. Does dist exist?")
|
logger.Fatal("Could not embed frontend. Does dist exist?")
|
||||||
}
|
}
|
||||||
} else {
|
} else {
|
||||||
assetFs = dirFS{Dir: http.Dir("http/dist")}
|
assetFs = dirFS{Dir: http.Dir("http/dist")}
|
||||||
|
@ -114,6 +114,9 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
|
||||||
api.HandleFunc("GET /settings", withAdmin(settingsGetHandler))
|
api.HandleFunc("GET /settings", withAdmin(settingsGetHandler))
|
||||||
api.HandleFunc("PUT /settings", withAdmin(settingsPutHandler))
|
api.HandleFunc("PUT /settings", withAdmin(settingsPutHandler))
|
||||||
|
|
||||||
|
api.HandleFunc("GET /onlyoffice/config", withUser(onlyofficeClientConfigGetHandler))
|
||||||
|
api.HandleFunc("POST /onlyoffice/callback", withUser(onlyofficeCallbackHandler))
|
||||||
|
|
||||||
api.HandleFunc("GET /search", withUser(searchHandler))
|
api.HandleFunc("GET /search", withUser(searchHandler))
|
||||||
apiPath := config.Server.BaseURL + "api"
|
apiPath := config.Server.BaseURL + "api"
|
||||||
router.Handle(apiPath+"/", http.StripPrefix(apiPath, api))
|
router.Handle(apiPath+"/", http.StripPrefix(apiPath, api))
|
||||||
|
@ -143,7 +146,7 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
|
||||||
// Load the TLS certificate and key
|
// Load the TLS certificate and key
|
||||||
cer, err := tls.LoadX509KeyPair(config.Server.TLSCert, config.Server.TLSKey)
|
cer, err := tls.LoadX509KeyPair(config.Server.TLSCert, config.Server.TLSKey)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("could not load certificate: %v", err)
|
logger.Fatal(fmt.Sprintf("could not load certificate: %v", err))
|
||||||
}
|
}
|
||||||
|
|
||||||
// Create a custom TLS listener
|
// Create a custom TLS listener
|
||||||
|
@ -158,17 +161,17 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
|
||||||
// Listen on TCP and wrap with TLS
|
// Listen on TCP and wrap with TLS
|
||||||
listener, err := tls.Listen("tcp", fmt.Sprintf(":%v", config.Server.Port), tlsConfig)
|
listener, err := tls.Listen("tcp", fmt.Sprintf(":%v", config.Server.Port), tlsConfig)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("could not start TLS server: %v", err)
|
logger.Fatal(fmt.Sprintf("could not start TLS server: %v", err))
|
||||||
}
|
}
|
||||||
if config.Server.Port != 443 {
|
if config.Server.Port != 443 {
|
||||||
port = fmt.Sprintf(":%d", config.Server.Port)
|
port = fmt.Sprintf(":%d", config.Server.Port)
|
||||||
}
|
}
|
||||||
// Build the full URL with host and port
|
// Build the full URL with host and port
|
||||||
fullURL := fmt.Sprintf("%s://localhost%s%s", scheme, port, config.Server.BaseURL)
|
fullURL := fmt.Sprintf("%s://localhost%s%s", scheme, port, config.Server.BaseURL)
|
||||||
log.Printf("Running at : %s", fullURL)
|
logger.Info(fmt.Sprintf("Running at : %s", fullURL))
|
||||||
err = http.Serve(listener, muxWithMiddleware(router))
|
err = http.Serve(listener, muxWithMiddleware(router))
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("could not start server: %v", err)
|
logger.Fatal(fmt.Sprintf("could not start server: %v", err))
|
||||||
}
|
}
|
||||||
} else {
|
} else {
|
||||||
// Set HTTP scheme and the default port for HTTP
|
// Set HTTP scheme and the default port for HTTP
|
||||||
|
@ -178,10 +181,10 @@ func StartHttp(Service ImgService, storage *storage.Storage, cache FileCache) {
|
||||||
}
|
}
|
||||||
// Build the full URL with host and port
|
// Build the full URL with host and port
|
||||||
fullURL := fmt.Sprintf("%s://localhost%s%s", scheme, port, config.Server.BaseURL)
|
fullURL := fmt.Sprintf("%s://localhost%s%s", scheme, port, config.Server.BaseURL)
|
||||||
log.Printf("Running at : %s", fullURL)
|
logger.Info(fmt.Sprintf("Running at : %s", fullURL))
|
||||||
err := http.ListenAndServe(fmt.Sprintf(":%v", config.Server.Port), muxWithMiddleware(router))
|
err := http.ListenAndServe(fmt.Sprintf(":%v", config.Server.Port), muxWithMiddleware(router))
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("could not start server: %v", err)
|
logger.Fatal(fmt.Sprintf("could not start server: %v", err))
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
|
@@ -4,7 +4,6 @@ import (
 	"encoding/json"
 	"fmt"
 	"io/fs"
-	"log"
 	"net/http"
 	"os"
 	"path/filepath"
@@ -12,6 +11,7 @@ import (
 	"text/template"
 
 	"github.com/gtsteffaniak/filebrowser/backend/auth"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/version"
 )
@@ -64,6 +64,7 @@ func handleWithStaticData(w http.ResponseWriter, r *http.Request, file, contentT
 		"ReCaptchaHost": config.Auth.Recaptcha.Host,
 		"ExternalLinks": config.Frontend.ExternalLinks,
 		"ExternalUrl":   strings.TrimSuffix(config.Server.ExternalUrl, "/"),
+		"OnlyOfficeUrl": settings.Config.Integrations.OnlyOffice.Url,
 	}
 
 	if config.Frontend.Files != "" {
@@ -71,7 +72,7 @@ func handleWithStaticData(w http.ResponseWriter, r *http.Request, file, contentT
 		_, err := os.Stat(fPath) //nolint:govet
 
 		if err != nil && !os.IsNotExist(err) {
-			log.Printf("couldn't load custom styles: %v", err)
+			logger.Error(fmt.Sprintf("couldn't load custom styles: %v", err))
 		}
 
 		if err == nil {
@@ -0,0 +1,132 @@
+package logger
+
+import (
+	"fmt"
+	"io"
+	"log"
+	"os"
+	"slices"
+	"strings"
+)
+
+// Logger wraps the standard log.Logger with log level functionality
+type Logger struct {
+	logger      *log.Logger
+	levels      []LogLevel
+	apiLevels   []LogLevel
+	stdout      bool
+	disabled    bool
+	disabledAPI bool
+	colors      bool
+}
+
+var stdOutLoggerExists bool
+
+// NewLogger creates a new Logger instance with separate file and stdout loggers
+func NewLogger(filepath string, levels, apiLevels []LogLevel, noColors bool) (*Logger, error) {
+	var fileWriter io.Writer = io.Discard
+	stdout := filepath == ""
+	// Configure file logging
+	if !stdout {
+		file, err := os.OpenFile(filepath, os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0666)
+		if err != nil {
+			return nil, fmt.Errorf("failed to open log file: %v", err)
+		}
+		fileWriter = file
+	}
+	flags := log.Ldate | log.Ltime
+	if slices.Contains(levels, DEBUG) {
+		flags |= log.Lshortfile
+	}
+	logger := log.New(os.Stdout, "", flags)
+	if filepath != "" {
+		logger = log.New(fileWriter, "", flags)
+	}
+	if stdout {
+		stdOutLoggerExists = true
+	}
+	return &Logger{
+		logger:      logger,
+		levels:      levels,
+		apiLevels:   apiLevels,
+		disabled:    slices.Contains(levels, DISABLED),
+		disabledAPI: slices.Contains(apiLevels, DISABLED),
+		colors:      !noColors,
+		stdout:      stdout,
+	}, nil
+}
+
+// SetupLogger configures the logger with file and stdout options and their respective log levels
+func SetupLogger(output, levels, apiLevels string, noColors bool) error {
+	upperLevels := []LogLevel{}
+	for _, level := range SplitByMultiple(levels) {
+		if level == "" {
+			break
+		}
+		upperLevel := strings.ToUpper(level)
+		if upperLevel == "WARNING" || upperLevel == "WARN" {
+			upperLevel = "WARN "
+		}
+		// Convert level strings to LogLevel
+		level, ok := stringToLevel[upperLevel]
+		if !ok {
+			loggers = []*Logger{}
+			return fmt.Errorf("invalid file log level: %s", upperLevel)
+		}
+		upperLevels = append(upperLevels, level)
+	}
+	if len(upperLevels) == 0 {
+		upperLevels = []LogLevel{INFO, ERROR, WARNING}
+	}
+	upperApiLevels := []LogLevel{}
+	for _, level := range SplitByMultiple(apiLevels) {
+		if level == "" {
+			break
+		}
+		upperLevel := strings.ToUpper(level)
+		if upperLevel == "WARNING" || upperLevel == "WARN" {
+			upperLevel = "WARN "
+		}
+		// Convert level strings to LogLevel
+		level, ok := stringToLevel[strings.ToUpper(upperLevel)]
+		if !ok {
+			return fmt.Errorf("invalid api log level: %s", upperLevel)
+		}
+		upperApiLevels = append(upperApiLevels, level)
+	}
+	if len(upperApiLevels) == 0 {
+		upperApiLevels = []LogLevel{INFO, ERROR, WARNING}
+	}
+	if slices.Contains(upperLevels, DISABLED) && slices.Contains(upperApiLevels, DISABLED) {
+		// both disabled, not creating a logger
+		loggers = []*Logger{}
+		return nil
+	}
+	outputStdout := strings.ToUpper(output)
+	if outputStdout == "STDOUT" {
+		output = ""
+	}
+	if output == "" && stdOutLoggerExists {
+		// stdout logger already exists... don't create another
+		return fmt.Errorf("stdout logger already exists, could not set config levels=[%v] apiLevels=[%v] noColors=[%v]", levels, apiLevels, noColors)
+	}
+	// Create the logger
+	logger, err := NewLogger(output, upperLevels, upperApiLevels, noColors)
+	if err != nil {
+		loggers = []*Logger{}
+		return err
+	}
+	loggers = append(loggers, logger)
+	return nil
+}
+
+func SplitByMultiple(str string) []string {
+	delimiters := []rune{'|', ',', ' '}
+	return strings.FieldsFunc(str, func(r rune) bool {
+		for _, d := range delimiters {
+			if r == d {
+				return true
+			}
+		}
+		return false
+	})
+}
@@ -0,0 +1,137 @@
+package logger
+
+import (
+	"fmt"
+	"log"
+	"slices"
+)
+
+type LogLevel int
+
+const (
+	DISABLED LogLevel = 0
+	ERROR    LogLevel = 1
+	FATAL    LogLevel = 1
+	WARNING  LogLevel = 2
+	INFO     LogLevel = 3
+	DEBUG    LogLevel = 4
+	API      LogLevel = 10
+	// COLORS
+	RED    = "\033[31m"
+	GREEN  = "\033[32m"
+	YELLOW = "\033[33m"
+	GRAY   = "\033[37m"
+)
+
+var (
+	loggers []*Logger
+)
+
+type levelConsts struct {
+	INFO     string
+	FATAL    string
+	ERROR    string
+	WARNING  string
+	DEBUG    string
+	API      string
+	DISABLED string
+}
+
+var levels = levelConsts{
+	INFO:     "INFO",
+	FATAL:    "FATAL",
+	ERROR:    "ERROR",
+	WARNING:  "WARN ", // with consistent space padding
+	DEBUG:    "DEBUG",
+	DISABLED: "DISABLED",
+	API:      "API",
+}
+
+// stringToLevel maps string representation to LogLevel
+var stringToLevel = map[string]LogLevel{
+	"DEBUG":    DEBUG,
+	"INFO":     INFO,
+	"ERROR":    ERROR,
+	"DISABLED": DISABLED,
+	"WARN ":    WARNING, // with consistent space padding
+	"FATAL":    FATAL,
+	"API":      API,
+}
+
+// Log prints a log message if its level is greater than or equal to the logger's levels
+func Log(level string, msg string, prefix, api bool, color string) {
+	LEVEL := stringToLevel[level]
+	for _, logger := range loggers {
+		if api {
+			if logger.disabledAPI || !slices.Contains(logger.apiLevels, LEVEL) {
+				continue
+			}
+		} else {
+			if logger.disabled || !slices.Contains(logger.levels, LEVEL) {
+				continue
+			}
+		}
+		if logger.stdout && LEVEL == FATAL {
+			continue
+		}
+		writeOut := msg
+		if prefix {
+			writeOut = fmt.Sprintf("[%s] ", level) + writeOut
+		}
+		if logger.colors && color != "" {
+			writeOut = color + writeOut + "\033[0m"
+		}
+		err := logger.logger.Output(3, writeOut) // 3 skips this function for correct file:line
+		if err != nil {
+			log.Printf("failed to log message '%v' with error `%v`", msg, err)
+		}
+	}
+}
+
+func Api(msg string, statusCode int) {
+	if statusCode >= 300 && statusCode < 500 {
+		Log(levels.WARNING, msg, false, true, YELLOW)
+	} else if statusCode >= 500 {
+		Log(levels.ERROR, msg, false, true, RED)
+	} else {
+		Log(levels.INFO, msg, false, true, GREEN)
+	}
+}
+
+// Helper methods for specific log levels
+func Debug(msg string) {
+	if len(loggers) > 0 {
+		Log(levels.DEBUG, msg, true, false, GRAY)
+	}
+}
+
+func Info(msg string) {
+	if len(loggers) > 0 {
+		Log(levels.INFO, msg, false, false, "")
+	} else {
+		log.Println(msg)
+	}
+}
+
+func Warning(msg string) {
+	if len(loggers) > 0 {
+		Log(levels.WARNING, msg, true, false, YELLOW)
+	} else {
+		log.Println("[WARN ]: " + msg)
+	}
+}
+
+func Error(msg string) {
+	if len(loggers) > 0 {
+		Log(levels.ERROR, msg, true, false, RED)
+	} else {
+		log.Println("[ERROR] : ", msg)
+	}
+}
+
+func Fatal(msg string) {
+	if len(loggers) > 0 {
+		Log(levels.FATAL, msg, true, false, RED)
+	}
+	log.Fatal("[FATAL] : ", msg)
+}
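`Api` picks the log level from the HTTP status code: 3xx/4xx map to warning, 5xx to error, and everything else to info. That mapping can be sketched on its own (the `levelFor` helper name is hypothetical, used here only to make the branch logic testable):

```go
package main

import "fmt"

// levelFor reproduces the status-code-to-level mapping used by Api above.
func levelFor(statusCode int) string {
	switch {
	case statusCode >= 300 && statusCode < 500:
		return "WARN" // redirects and client errors
	case statusCode >= 500:
		return "ERROR" // server errors
	default:
		return "INFO" // 1xx/2xx and anything below 300
	}
}

func main() {
	fmt.Println(levelFor(200), levelFor(404), levelFor(502)) // INFO WARN ERROR
}
```

Note that redirects (3xx) are logged as warnings alongside client errors, which keeps 2xx traffic quiet at the default API levels.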
@@ -1,17 +0,0 @@
-#!/bin/sh
-## TEST file used by docker testing containers
-checkExit() {
-  if [ "$?" -ne 0 ];then
-    exit 1
-  fi
-}
-
-if command -v go &> /dev/null
-then
-    printf "\n == Running tests == \n"
-    go test -race -parallel -v ./...
-    checkExit
-else
-    echo "ERROR: unable to perform tests"
-    exit 1
-fi
@@ -1,14 +0,0 @@
-#!/usr/bin/env bash
-set -e
-
-if ! [ -x "$(command -v standard-version)" ]; then
-  echo "standard-version is not installed. please run 'npm i -g standard-version'"
-  exit 1
-fi
-
-standard-version --dry-run --skip
-read -p "Continue (y/n)? " -n 1 -r
-echo ;
-if [[ $REPLY =~ ^[Yy]$ ]]; then
-  standard-version -s ;
-fi
@@ -1,11 +0,0 @@
-#!/usr/bin/env bash
-set -e
-
-if ! [ -x "$(command -v commitlint)" ]; then
-  echo "commitlint is not installed. please run 'npm i -g commitlint'"
-  exit 1
-fi
-
-for commit_hash in $(git log --pretty=format:%H origin/master..HEAD); do
-  commitlint -f ${commit_hash}~1 -t ${commit_hash}
-done
@@ -8,6 +8,7 @@ import (
 	"strings"
 
 	"github.com/goccy/go-yaml"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/users"
 	"github.com/gtsteffaniak/filebrowser/backend/version"
 )
@@ -15,29 +16,32 @@ import (
 var Config Settings
 
 func Initialize(configFile string) {
-	yamlData := loadConfigFile(configFile)
-	Config = setDefaults()
-	err := yaml.Unmarshal(yamlData, &Config)
+	yamlData, err := loadConfigFile(configFile)
 	if err != nil {
-		log.Fatalf("Error unmarshaling YAML data: %v", err)
+		logger.Warning(fmt.Sprintf("Could not load config file '%v', using default settings: %v", configFile, err))
+	}
+	Config = setDefaults()
+	err = yaml.Unmarshal(yamlData, &Config)
+	if err != nil {
+		logger.Fatal(fmt.Sprintf("Error unmarshaling YAML data: %v", err))
 	}
 	Config.UserDefaults.Perm = Config.UserDefaults.Permissions
 	// Convert relative path to absolute path
 	if len(Config.Server.Sources) > 0 {
 		// TODO allow multipe sources not named default
 		for _, source := range Config.Server.Sources {
-			realPath, err := filepath.Abs(source.Path)
-			if err != nil {
-				log.Fatalf("Error getting source path: %v", err)
+			realPath, err2 := filepath.Abs(source.Path)
+			if err2 != nil {
+				logger.Fatal(fmt.Sprintf("Error getting source path: %v", err2))
 			}
 			source.Path = realPath
 			source.Name = "default"                   // Modify the local copy of the map value
 			Config.Server.Sources["default"] = source // Assign the modified value back to the map
 		}
 	} else {
-		realPath, err := filepath.Abs(Config.Server.Root)
-		if err != nil {
-			log.Fatalf("Error getting source path: %v", err)
+		realPath, err2 := filepath.Abs(Config.Server.Root)
+		if err2 != nil {
+			logger.Fatal(fmt.Sprintf("Error getting source path: %v", err2))
 		}
 		Config.Server.Sources = map[string]Source{
 			"default": {
@@ -67,28 +71,46 @@ func Initialize(configFile string) {
 			Url:  "https://github.com/gtsteffaniak/filebrowser/wiki",
 		})
 	}
+	if len(Config.Server.Logging) == 0 {
+		Config.Server.Logging = []LogConfig{
+			{
+				Output: "stdout",
+			},
+		}
+	}
+	for _, logConfig := range Config.Server.Logging {
+		err = logger.SetupLogger(
+			logConfig.Output,
+			logConfig.Levels,
+			logConfig.ApiLevels,
+			logConfig.NoColors,
+		)
+		if err != nil {
+			log.Println("[ERROR] Failed to set up logger:", err)
+		}
+	}
+
 }
 
-func loadConfigFile(configFile string) []byte {
+func loadConfigFile(configFile string) ([]byte, error) {
 	// Open and read the YAML file
 	yamlFile, err := os.Open(configFile)
 	if err != nil {
-		log.Println(err)
-		os.Exit(1)
+		return nil, err
 	}
 	defer yamlFile.Close()
 
 	stat, err := yamlFile.Stat()
 	if err != nil {
-		log.Fatalf("error getting file information: %s", err.Error())
+		return nil, err
 	}
 
 	yamlData := make([]byte, stat.Size())
 	_, err = yamlFile.Read(yamlData)
 	if err != nil {
-		log.Fatalf("Error reading YAML data: %v", err)
+		return nil, err
 	}
-	return yamlData
+	return yamlData, nil
 }
 
 func setDefaults() Settings {
@@ -101,7 +123,6 @@ func setDefaults() Settings {
 			NumImageProcessors: 4,
 			BaseURL:            "",
 			Database:           "database.db",
-			Log:                "stdout",
 			Root:               ".",
 		},
 		Auth: Auth{
@@ -35,7 +35,7 @@ func Test_loadConfigFile(t *testing.T) {
 	}
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			if got := loadConfigFile(tt.args.configFile); !reflect.DeepEqual(got, tt.want) {
+			if got, _ := loadConfigFile(tt.args.configFile); !reflect.DeepEqual(got, tt.want) {
 				t.Errorf("loadConfigFile() = %v, want %v", got, tt.want)
 			}
 		})
@@ -3,12 +3,13 @@ package settings
 
 import (
 	"errors"
 	"fmt"
-	"log"
 	"os"
 	"path"
 	"path/filepath"
 	"regexp"
 	"strings"
 
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )
 
 var (
@@ -22,7 +23,7 @@ func (s *Settings) MakeUserDir(username, userScope, serverRoot string) (string,
 	if userScope == "" && s.Server.CreateUserDir {
 		username = cleanUsername(username)
 		if username == "" || username == "-" || username == "." {
-			log.Printf("create user: invalid user for home dir creation: [%s]", username)
+			logger.Error(fmt.Sprintf("create user: invalid user for home dir creation: [%s]", username))
 			return "", errors.New("invalid user for home dir creation")
 		}
 		userScope = path.Join(s.Server.UserHomeBasePath, username)
@@ -1,22 +1,23 @@
 package settings
 
 import (
-	"log"
+	"fmt"
 	"testing"
 
 	"github.com/goccy/go-yaml"
 	"github.com/google/go-cmp/cmp"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )
 
 func TestConfigLoadChanged(t *testing.T) {
-	yamlData := loadConfigFile("./testingConfig.yaml")
+	yamlData, _ := loadConfigFile("./testingConfig.yaml")
 	// Marshal the YAML data to a more human-readable format
 	newConfig := setDefaults()
 	Config := setDefaults()
 
 	err := yaml.Unmarshal(yamlData, &newConfig)
 	if err != nil {
-		log.Fatalf("Error unmarshaling YAML data: %v", err)
+		logger.Fatal(fmt.Sprintf("Error unmarshaling YAML data: %v", err))
 	}
 	// Use go-cmp to compare the two structs
 	if diff := cmp.Diff(newConfig, Config); diff == "" {
@@ -25,14 +26,14 @@ func TestConfigLoadChanged(t *testing.T) {
 }
 
 func TestConfigLoadSpecificValues(t *testing.T) {
-	yamlData := loadConfigFile("./testingConfig.yaml")
+	yamlData, _ := loadConfigFile("./testingConfig.yaml")
 	// Marshal the YAML data to a more human-readable format
 	newConfig := setDefaults()
 	Config := setDefaults()
 
 	err := yaml.Unmarshal(yamlData, &newConfig)
 	if err != nil {
-		log.Fatalf("Error unmarshaling YAML data: %v", err)
+		logger.Fatal(fmt.Sprintf("Error unmarshaling YAML data: %v", err))
 	}
 	testCases := []struct {
 		fieldName string
@@ -13,6 +13,7 @@ type Settings struct {
 	Frontend     Frontend       `json:"frontend"`
 	Users        []UserDefaults `json:"users,omitempty"`
 	UserDefaults UserDefaults   `json:"userDefaults"`
+	Integrations Integrations   `json:"integrations"`
 }
 
 type Auth struct {
@@ -47,13 +48,33 @@ type Server struct {
 	Port             int               `json:"port"`
 	BaseURL          string            `json:"baseURL"`
 	Address          string            `json:"address"`
-	Log              string            `json:"log"`
+	Logging          []LogConfig       `json:"logging"`
 	Database         string            `json:"database"`
 	Root             string            `json:"root"`
 	UserHomeBasePath string            `json:"userHomeBasePath"`
 	CreateUserDir    bool              `json:"createUserDir"`
 	Sources          map[string]Source `json:"sources"`
 	ExternalUrl      string            `json:"externalUrl"`
+	InternalUrl      string            `json:"internalUrl"` // used by integrations
+}
+
+type Integrations struct {
+	OnlyOffice OnlyOffice `json:"office"`
+}
+
+// onlyoffice secret is stored in the local.json file
+// docker exec <containerID> /var/www/onlyoffice/documentserver/npm/json -f /etc/onlyoffice/documentserver/local.json 'services.CoAuthoring.secret.session.string'
+type OnlyOffice struct {
+	Url    string `json:"url"`
+	Secret string `json:"secret"`
+}
+
+type LogConfig struct {
+	Levels    string `json:"levels"`
+	ApiLevels string `json:"apiLevels"`
+	Output    string `json:"output"`
+	NoColors  bool   `json:"noColors"`
+	Json      bool   `json:"json"`
 }
 
 type Source struct {
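With the single `Log` string replaced by a `Logging` list, a config file can declare multiple outputs with independent level filters. A hedged example of what such a section might look like, based on the `LogConfig` JSON tags above (the file path and level choices are illustrative, and the YAML key casing assumes the JSON tags carry over to YAML parsing):

```yaml
server:
  logging:
    - output: stdout
      levels: "info|warning|error"
      apiLevels: "warning|error"
      noColors: false
    - output: filebrowser.log
      levels: "debug"
```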
@@ -10,7 +10,6 @@ server:
   port: 80
   baseURL: "/"
   address: ""
-  log: "stdout"
   database: "mydb.db"
   root: "/srv"
 auth:
@@ -1,13 +1,14 @@
 package bolt
 
 import (
-	"log"
+	"fmt"
 	"time"
 
 	"github.com/asdine/storm/v3"
 	"github.com/asdine/storm/v3/q"
 
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/share"
 )
 
@@ -68,7 +69,7 @@ func (s shareBackend) Gets(path string, id uint) ([]*share.Link, error) {
 		if v[i].Expire < time.Now().Unix() {
 			err = s.Delete(v[i].PasswordHash)
 			if err != nil {
-				log.Println("expired share could not be deleted: ", err.Error())
+				logger.Error(fmt.Sprintf("expired share could not be deleted: %v", err.Error()))
 			}
 		} else {
 			filteredList = append(filteredList, v[i])
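The `Gets` change above keeps the same filter-while-iterating pattern: expired links are deleted from the store, everything else is collected into the result. A standalone sketch of that pattern with simplified types (`link` and `filterExpired` are illustrative stand-ins, not the package's API):

```go
package main

import (
	"fmt"
	"time"
)

// link is a simplified stand-in for share.Link: just a hash and expiry.
type link struct {
	hash   string
	expire int64 // unix seconds
}

// filterExpired keeps unexpired links, mirroring the loop in Gets.
func filterExpired(all []link, now int64) []link {
	var kept []link
	for _, l := range all {
		if l.expire < now {
			// expired: the real code deletes it from the store here
			continue
		}
		kept = append(kept, l)
	}
	return kept
}

func main() {
	now := time.Now().Unix()
	all := []link{{"a", now - 10}, {"b", now + 60}, {"c", now + 120}}
	fmt.Println(len(filterExpired(all, now))) // 2
}
```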
@@ -2,7 +2,6 @@ package storage
 
 import (
 	"fmt"
-	"log"
 	"os"
 	"path/filepath"
 
@@ -10,6 +9,7 @@ import (
 	"github.com/gtsteffaniak/filebrowser/backend/auth"
 	"github.com/gtsteffaniak/filebrowser/backend/errors"
 	"github.com/gtsteffaniak/filebrowser/backend/files"
+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 	"github.com/gtsteffaniak/filebrowser/backend/settings"
 	"github.com/gtsteffaniak/filebrowser/backend/share"
 	"github.com/gtsteffaniak/filebrowser/backend/storage/bolt"
@@ -118,15 +118,15 @@ func CreateUser(userInfo users.User, asAdmin bool) error {
 	// create new home directory
 	userHome, err := settings.Config.MakeUserDir(newUser.Username, newUser.Scope, files.RootPaths["default"])
 	if err != nil {
-		log.Printf("create user: failed to mkdir user home dir: [%s]", userHome)
+		logger.Error(fmt.Sprintf("create user: failed to mkdir user home dir: [%s]", userHome))
 		return err
 	}
 	newUser.Scope = userHome
-	log.Printf("user: %s, home dir: [%s].", newUser.Username, userHome)
+	logger.Debug(fmt.Sprintf("user: %s, home dir: [%s].", newUser.Username, userHome))
 	idx := files.GetIndex("default")
 	_, _, err = idx.GetRealPath(newUser.Scope)
 	if err != nil {
-		log.Println("user path is not valid", newUser.Scope)
+		logger.Error(fmt.Sprintf("user path is not valid: %v", newUser.Scope))
 		return nil
 	}
 	err = store.Users.Save(&newUser)
@@ -1171,30 +1171,37 @@ const docTemplate = `{
             "type": "object",
             "properties": {
                 "files": {
+                    "description": "files in the directory",
                     "type": "array",
                     "items": {
                         "$ref": "#/definitions/files.ItemInfo"
                     }
                 },
                 "folders": {
+                    "description": "folders in the directory",
                     "type": "array",
                     "items": {
                         "$ref": "#/definitions/files.ItemInfo"
                     }
                 },
                 "modified": {
+                    "description": "modification time",
                     "type": "string"
                 },
                 "name": {
+                    "description": "name of the file",
                     "type": "string"
                 },
                 "path": {
+                    "description": "path scoped to the associated index",
                     "type": "string"
                 },
                 "size": {
+                    "description": "length in bytes for regular files",
                     "type": "integer"
                 },
                 "type": {
+                    "description": "type of the file, either \"directory\" or a file mimetype",
                     "type": "string"
                 }
             }
@@ -1203,15 +1210,19 @@ const docTemplate = `{
             "type": "object",
             "properties": {
                 "modified": {
+                    "description": "modification time",
                     "type": "string"
                 },
                 "name": {
+                    "description": "name of the file",
                     "type": "string"
                 },
                 "size": {
+                    "description": "length in bytes for regular files",
                     "type": "integer"
                 },
                 "type": {
+                    "description": "type of the file, either \"directory\" or a file mimetype",
                    "type": "string"
                 }
             }
@@ -1160,30 +1160,37 @@
            "type": "object",
            "properties": {
                "files": {
+                    "description": "files in the directory",
                    "type": "array",
                    "items": {
                        "$ref": "#/definitions/files.ItemInfo"
                    }
                },
                "folders": {
+                    "description": "folders in the directory",
                    "type": "array",
                    "items": {
                        "$ref": "#/definitions/files.ItemInfo"
                    }
                },
                "modified": {
+                    "description": "modification time",
                    "type": "string"
                },
                "name": {
+                    "description": "name of the file",
                    "type": "string"
                },
                "path": {
+                    "description": "path scoped to the associated index",
                    "type": "string"
                },
                "size": {
+                    "description": "length in bytes for regular files",
                    "type": "integer"
                },
                "type": {
+                    "description": "type of the file, either \"directory\" or a file mimetype",
                    "type": "string"
                }
            }
@@ -1192,15 +1199,19 @@
            "type": "object",
            "properties": {
                "modified": {
+                    "description": "modification time",
                    "type": "string"
                },
                "name": {
+                    "description": "name of the file",
                    "type": "string"
                },
                "size": {
+                    "description": "length in bytes for regular files",
                    "type": "integer"
                },
                "type": {
+                    "description": "type of the file, either \"directory\" or a file mimetype",
                    "type": "string"
                }
            }
@@ -2,33 +2,44 @@ definitions:
   files.FileInfo:
     properties:
       files:
+        description: files in the directory
         items:
           $ref: '#/definitions/files.ItemInfo'
         type: array
       folders:
+        description: folders in the directory
         items:
           $ref: '#/definitions/files.ItemInfo'
         type: array
       modified:
+        description: modification time
         type: string
       name:
+        description: name of the file
         type: string
       path:
+        description: path scoped to the associated index
         type: string
       size:
+        description: length in bytes for regular files
         type: integer
       type:
+        description: type of the file, either "directory" or a file mimetype
         type: string
     type: object
   files.ItemInfo:
     properties:
       modified:
+        description: modification time
         type: string
       name:
+        description: name of the file
         type: string
       size:
+        description: length in bytes for regular files
         type: integer
       type:
+        description: type of the file, either "directory" or a file mimetype
         type: string
     type: object
   files.SearchResult:
@@ -2,17 +2,20 @@ package utils

 import (
 	"crypto/rand"
+	"crypto/sha256"
+	"encoding/hex"
 	"fmt"
-	"log"
 	math "math/rand"
 	"reflect"
 	"strings"
 	"time"

+	"github.com/gtsteffaniak/filebrowser/backend/logger"
 )

 func CheckErr(source string, err error) {
 	if err != nil {
-		log.Fatalf("%s: %v", source, err)
+		logger.Fatal(fmt.Sprintf("%s: %v", source, err))
 	}
 }
@@ -33,7 +36,7 @@ func CapitalizeFirst(s string) string {
 	return strings.ToUpper(string(s[0])) + s[1:]
 }

-func GenerateRandomHash(length int) string {
+func InsecureRandomIdentifier(length int) string {
 	const charset = "abcdefghijklmnopqrstuvwxyz0123456789"
 	math.New(math.NewSource(time.Now().UnixNano()))
 	result := make([]byte, length)
@@ -49,7 +52,7 @@ func PrintStructFields(v interface{}) {

 	// Ensure the input is a struct
 	if val.Kind() != reflect.Struct {
-		fmt.Println("Provided value is not a struct")
+		logger.Debug("Provided value is not a struct")
 		return
 	}
@@ -66,7 +69,7 @@ func PrintStructFields(v interface{}) {
 			fieldValue = fieldValue[:100] + "..."
 		}

-		fmt.Printf("Field: %s, %s\n", fieldType.Name, fieldValue)
+		logger.Debug(fmt.Sprintf("Field: %s, %s\n", fieldType.Name, fieldValue))
 	}
 }
@@ -84,3 +87,8 @@ func GetParentDirectoryPath(path string) string {
 	}
 	return path[:lastSlash]
 }
+
+func HashSHA256(data string) string {
+	bytes := sha256.Sum256([]byte(data))
+	return hex.EncodeToString(bytes[:])
+}
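The new `HashSHA256` helper can be exercised on its own; a minimal standalone sketch (reimplemented outside the package, so no module import path is needed):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// HashSHA256 mirrors the helper added in the hunk above: it returns
// the hex-encoded SHA-256 digest of the input string.
func HashSHA256(data string) string {
	bytes := sha256.Sum256([]byte(data))
	return hex.EncodeToString(bytes[:])
}

func main() {
	// The SHA-256 digest of "abc" is a well-known test vector.
	fmt.Println(HashSHA256("abc"))
}
```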
@@ -10,9 +10,9 @@ async function globalSetup() {
   await page.getByPlaceholder("Password").fill("admin");
   await page.getByRole("button", { name: "Login" }).click();
   await page.waitForURL("**/files/", { timeout: 100 });
-  await expect(page).toHaveTitle('FileBrowser Quantum - Files');
   let cookies = await context.cookies();
   expect(cookies.find((c) => c.name == "auth")?.value).toBeDefined();
+  await expect(page).toHaveTitle('playwright-files - FileBrowser Quantum - Files');
   await page.context().storageState({ path: "./loginAuth.json" });
   await browser.close();
 }
@@ -20,7 +20,7 @@
     "test": "vitest run "
   },
   "dependencies": {
-    "@playwright/test": "^1.49.1",
+    "@onlyoffice/document-editor-vue": "^1.4.0",
     "ace-builds": "^1.24.2",
     "clipboard": "^2.0.4",
     "css-vars-ponyfill": "^2.4.3",

@@ -33,6 +33,7 @@
     "vue-router": "^4.3.0"
   },
   "devDependencies": {
+    "@playwright/test": "^1.49.1",
     "@intlify/unplugin-vue-i18n": "^4.0.0",
     "@vitejs/plugin-vue": "^5.0.4",
     "@vue/eslint-config-typescript": "^13.0.0",
@@ -25,7 +25,7 @@ export default defineConfig({
   reporter: "line",
   /* Shared settings for all the projects below. See https://playwright.dev/docs/api/class-testoptions. */
   use: {
-    actionTimeout: 500,
+    actionTimeout: 5000,
     storageState: "loginAuth.json",
     /* Base URL to use in actions like `await page.goto('/')`. */
     baseURL: "http://127.0.0.1",
@@ -13,7 +13,7 @@

     <link rel="icon" type="image/png" sizes="256x256" href="{{ .StaticURL }}/img/icons/favicon-256x256.png">

-    <link href="https://fonts.googleapis.com/css?family=Material+Icons|Material+Symbols+Outlined" rel="stylesheet">
+    <!--<link href="https://fonts.googleapis.com/css?family=Material+Icons|Material+Symbols+Outlined" rel="stylesheet">-->

     <!-- Add to home screen for Android and modern mobile browsers -->
     <link rel="manifest" id="manifestPlaceholder" crossorigin="use-credentials">
@@ -2,6 +2,7 @@ import { fetchURL, adjustedData } from "./utils";
 import { removePrefix, getApiPath } from "@/utils/url.js";
 import { state } from "@/store";
 import { notify } from "@/notify";
+import { externalUrl } from "@/utils/constants";

 // Notify if errors occur
 export async function fetchFiles(url, content = false) {
@@ -167,13 +168,16 @@ export async function checksum(url, algo) {
   }
 }

-export function getDownloadURL(path, inline) {
+export function getDownloadURL(path, inline, useExternal) {
   try {
     const params = {
       files: encodeURIComponent(removePrefix(decodeURI(path),"files")),
       ...(inline && { inline: "true" }),
     };
     const apiPath = getApiPath("api/raw", params);
+    if (externalUrl && useExternal) {
+      return externalUrl+apiPath
+    }
     return window.origin+apiPath
   } catch (err) {
     notify.showError(err.message || "Error getting download URL");
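The branch added to `getDownloadURL` prefers a configured external URL over the window origin when the caller asks for it. A minimal Go sketch of the same selection logic (the function and parameter names here are illustrative, not part of the backend API):

```go
package main

import (
	"fmt"
	"net/url"
)

// buildDownloadURL sketches the external-URL branch added above:
// build the raw-file API path, then prefix it with either the
// configured external URL or the local origin.
func buildDownloadURL(origin, externalUrl, path string, useExternal bool) string {
	params := url.Values{}
	params.Set("files", path) // query-escaped, like encodeURIComponent
	apiPath := "/api/raw?" + params.Encode()
	if externalUrl != "" && useExternal {
		return externalUrl + apiPath
	}
	return origin + apiPath
}

func main() {
	fmt.Println(buildDownloadURL("http://127.0.0.1", "https://files.example.com", "/docs/a.txt", true))
}
```

This matters for OnlyOffice: the document server fetches the file itself, so it needs a URL reachable from outside the browser session.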
Binary file not shown.
Binary file not shown.
@@ -86,7 +86,6 @@ over
 /* Main Content */
 main {
   position: fixed;
-  padding: .5em;
   padding-top: 4em;
   overflow: scroll;
   top: 0;
@@ -95,8 +94,15 @@ main {
   display: flex;
   flex-direction: column;
 }

 main > div {
-  height: calc(100% - 3em);
+  height: 100%;
+}
+
+.main-padding {
+  padding: 0.5em;
+  padding-top: 4em;
+
 }

 .breadcrumbs {
@@ -166,3 +166,48 @@
   src: local('Roboto Bold'), local('Roboto-Bold'), url(../assets/fonts/roboto/bold-latin.woff2) format('woff2');
   unicode-range: U+0000-00FF, U+0131, U+0152-0153, U+02C6, U+02DA, U+02DC, U+2000-206F, U+2074, U+20AC, U+2212, U+2215, U+E0FF, U+EFFD, U+F000;
 }
+
+/* fallback */
+@font-face {
+  font-family: 'Material Icons';
+  font-style: normal;
+  font-weight: 400;
+  src: url(../assets/fonts/material/icons.woff2) format('woff2');
+}
+/* fallback */
+@font-face {
+  font-family: 'Material Symbols Outlined';
+  font-style: normal;
+  font-weight: 400;
+  src: url(../assets/fonts/material/symbols-outlined.woff2) format('woff');
+}
+
+.material-icons {
+  font-family: 'Material Icons';
+  font-weight: normal;
+  font-style: normal;
+  font-size: 24px;
+  line-height: 1;
+  letter-spacing: normal;
+  text-transform: none;
+  display: inline-block;
+  white-space: nowrap;
+  word-wrap: normal;
+  direction: ltr;
+  -webkit-font-smoothing: antialiased;
+}
+
+.material-symbols-outlined {
+  font-family: 'Material Symbols Outlined';
+  font-weight: normal;
+  font-style: normal;
+  font-size: 24px;
+  line-height: 1;
+  letter-spacing: normal;
+  text-transform: none;
+  display: inline-block;
+  white-space: nowrap;
+  word-wrap: normal;
+  direction: ltr;
+  -webkit-font-smoothing: antialiased;
+}
@@ -100,7 +100,7 @@ export const getters = {
     if (typeof getters.currentPromptName() === "string" && !getters.isStickySidebar()) {
       visible = false;
     }
-    if (getters.currentView() == "editor" || getters.currentView() == "preview") {
+    if (getters.currentView() == "editor" || getters.currentView() == "preview" || getters.currentView() == "onlyOfficeEditor") {
       visible = false;
     }
     return visible
@@ -123,6 +123,9 @@ export const getters = {
     const shouldOverlaySidebar = getters.isSidebarVisible() && !getters.isStickySidebar()
     return hasPrompt || shouldOverlaySidebar;
   },
+  showBreadCrumbs: () => {
+    return getters.currentView() == "listingView" ;
+  },
   routePath: (trimModifier="") => {
     return removePrefix(state.route.path,trimModifier)
   },
@@ -136,6 +139,8 @@ export const getters = {
     if (state.req.type !== undefined) {
       if (state.req.type == "directory") {
         return "listingView";
+      } else if (state.req?.onlyOfficeId) {
+        return "onlyOfficeEditor";
       } else if ("content" in state.req) {
         return "editor";
       } else {
@@ -140,21 +140,16 @@ export const mutations = {
     emitStateChanged();
   },
   addSelected: (value) => {
-    console.log("addSelected", value)
     state.selected.push(value);
     emitStateChanged();
   },
   removeSelected: (value) => {
-    console.log("removeSelected", value)
-
     let i = state.selected.indexOf(value);
     if (i === -1) return;
     state.selected.splice(i, 1);
     emitStateChanged();
   },
   resetSelected: () => {
-    console.log("resetSelected")
-
     state.selected = [];
     mutations.setMultiple(false);
     emitStateChanged();
@@ -7,6 +7,7 @@ import { recaptcha, loginPage } from "@/utils/constants";

 export async function setNewToken(token) {
   document.cookie = `auth=${token}; path=/`;
+  mutations.setJWT(token);
   mutations.setSession(generateRandomCode(8));
 }
@@ -44,7 +45,6 @@ export async function login(username, password, recaptcha) {
 }

 export async function renew(jwt) {
-  console.log("Renewing token");
   let apiPath = getApiPath("api/auth/renew")
   const res = await fetch(apiPath, {
     method: "POST",
@@ -18,6 +18,7 @@ const enableThumbs = window.FileBrowser.EnableThumbs;
 const resizePreview = window.FileBrowser.ResizePreview;
 const enableExec = window.FileBrowser.EnableExec;
 const externalUrl = window.FileBrowser.ExternalUrl
+const onlyOfficeUrl = window.FileBrowser.OnlyOfficeUrl
 const origin = window.location.origin;

 const settings = [
@@ -49,5 +50,6 @@ export {
   enableExec,
   origin,
   darkMode,
-  settings
+  settings,
+  onlyOfficeUrl,
 };
@@ -4,3 +4,13 @@ export default function (name) {
   );
   return document.cookie.replace(re, "$1");
 }
+
+export function getCookie(name) {
+  let cookie = document.cookie
+    .split(";")
+    .find((cookie) => cookie.includes(name + "="));
+  if (cookie != null) {
+    return cookie.split("=")[1];
+  }
+  return ""
+}
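Worth noting: the new `getCookie` matches by substring (`cookie.includes(name + "=")`), so a cookie named `xauth` would also match a lookup for `auth`. A Go sketch of the same lookup over a raw `Cookie` header string, but with an exact name comparison (the function here is illustrative, not part of the backend):

```go
package main

import (
	"fmt"
	"strings"
)

// getCookie parses a "k=v; k2=v2" cookie header and returns the value
// whose name matches exactly, avoiding the substring pitfall where
// "xauth=1" would shadow a lookup for "auth".
func getCookie(header, name string) string {
	for _, part := range strings.Split(header, ";") {
		k, v, ok := strings.Cut(strings.TrimSpace(part), "=")
		if ok && k == name {
			return v
		}
	}
	return ""
}

func main() {
	fmt.Println(getCookie("xauth=1; auth=abc123", "auth"))
}
```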
@@ -3,8 +3,6 @@ import url from "@/utils/url.js";
 import { filesApi } from "@/api";

 export function checkConflict(files, items) {
-  console.log("testing",files)
-
   if (typeof items === "undefined" || items === null) {
     items = [];
   }
@@ -1,6 +1,6 @@
 <template>
   <div>
-    <breadcrumbs base="/files" />
+    <breadcrumbs v-if="showBreadCrumbs" base="/files" />
     <errors v-if="error" :errorCode="error.status" />
     <component v-else-if="currentViewLoaded" :is="currentView"></component>
     <div v-else>
@@ -23,6 +23,7 @@ import Errors from "@/views/Errors.vue";
 import Preview from "@/views/files/Preview.vue";
 import ListingView from "@/views/files/ListingView.vue";
 import Editor from "@/views/files/Editor.vue";
+import OnlyOfficeEditor from "./files/OnlyOfficeEditor.vue";
 import { state, mutations, getters } from "@/store";
 import { url } from "@/utils";
 import { notify } from "@/notify";

@@ -36,6 +37,7 @@ export default {
     Preview,
     ListingView,
     Editor,
+    OnlyOfficeEditor,
   },
   data() {
     return {

@@ -46,6 +48,9 @@ export default {
     };
   },
   computed: {
+    showBreadCrumbs() {
+      return getters.showBreadCrumbs();
+    },
     currentView() {
       return getters.currentView();
     },
@@ -110,8 +115,10 @@ export default {
       // If not a directory, fetch content
       if (res.type != "directory") {
         let content = false;
-        // only check content for blob or text files
-        if (res.type.startsWith("application") || res.type.startsWith("text")) {
+        if (
+          !res.onlyOfficeId &&
+          (res.type.startsWith("application") || res.type.startsWith("text"))
+        ) {
           content = true;
         }
         res = await filesApi.fetchFiles(getters.routePath(), content);
@@ -20,7 +20,7 @@
   <defaultBar v-else :class="{ 'dark-mode-header': isDarkMode }"></defaultBar>
   <sidebar></sidebar>
   <search v-if="showSearch"></search>
-  <main :class="{ 'dark-mode': isDarkMode, moveWithSidebar: moveWithSidebar }">
+  <main :class="{ 'dark-mode': isDarkMode, moveWithSidebar: moveWithSidebar, 'main-padding': showPadding }">
     <router-view></router-view>
   </main>
   <prompts :class="{ 'dark-mode': isDarkMode }"></prompts>
@@ -71,6 +71,9 @@ export default {
     }
   },
   computed: {
+    showPadding() {
+      return getters.showBreadCrumbs();
+    },
     showSearch() {
       return getters.isLoggedIn() && this.currentView == "listingView";
     },
@@ -3,6 +3,7 @@
     <action v-if="notShare" icon="close" :label="$t('buttons.close')" @action="close()" />
     <title v-if="isSettings" class="topTitle">Settings</title>
     <title v-else class="topTitle">{{ req.name }}</title>
+    <action icon="hide_source" />
   </header>
 </template>
@@ -36,10 +37,12 @@ export default {
       mutations.closeHovers();
       return;
     }
-    mutations.replaceRequest({});
-    let uri = url.removeLastDir(state.route.path) + "/";
-    router.push({ path: uri });
     mutations.closeHovers();
+    setTimeout(() => {
+      mutations.replaceRequest({});
+      let uri = url.removeLastDir(state.route.path) + "/";
+      router.push({ path: uri });
+    }, 50);
   },
 },
};
@@ -9,6 +9,10 @@
       :label="$t('buttons.save')"
       @action="save()"
     />
+    <action
+      v-else
+      icon="hide_source"
+    />
   </header>
 </template>
@@ -0,0 +1,56 @@
+<template>
+  <!-- Conditionally render the DocumentEditor component -->
+  <DocumentEditor
+    v-if="ready"
+    id="docEditor"
+    :documentServerUrl="onlyOfficeUrl"
+    :config="clientConfig"
+    :onLoadComponentError="onLoadComponentError"
+  />
+  <div v-else>
+    <p>Loading editor...</p>
+  </div>
+</template>
+
+<script lang="ts">
+import { DocumentEditor } from "@onlyoffice/document-editor-vue";
+import { onlyOfficeUrl } from "@/utils/constants";
+import { state } from "@/store";
+import { fetchJSON } from "@/api/utils";
+import { filesApi } from "@/api";
+import { baseURL } from "@/utils/constants";
+
+export default {
+  name: "onlyOfficeEditor",
+  components: {
+    DocumentEditor,
+  },
+  data() {
+    return {
+      ready: false, // Flag to indicate whether the setup is complete
+      clientConfig: {},
+    };
+  },
+  computed: {
+    req() {
+      return state.req;
+    },
+    onlyOfficeUrl() {
+      return onlyOfficeUrl;
+    },
+  },
+  async mounted() {
+    // Perform the setup and update the config
+    try {
+      const refUrl = await filesApi.getDownloadURL(state.req.path, false, true);
+      let configData = await fetchJSON(baseURL + `api/onlyoffice/config?url=${refUrl}`);
+      configData.type = state.isMobile ? "mobile" : "desktop";
+      this.clientConfig = configData;
+      this.ready = true;
+    } catch (error) {
+      console.error("Error during setup:", error);
+      // Handle setup failure if needed
+    }
+  },
+};
+</script>
@@ -110,7 +110,7 @@ export default {
       event.preventDefault();
       try {
         if (this.isNew) {
-          await usersApi.create(this.userPayload);
+          await usersApi.create(this.userPayload); // Use the computed property
           this.$router.push({ path: "/settings", hash: "#users-main" });
           notify.showSuccess(this.$t("settings.userCreated"));
         } else {
makefile
@@ -51,7 +51,8 @@ test-backend:
 test-frontend:
	cd frontend && npm run test

-test-playwright: run-frontend build-backend
+test-playwright: run-frontend
+	cd backend && GOOS=linux go build -o filebrowser . && cd .. && \
	docker build -t filebrowser-playwright-tests -f Dockerfile.playwright .
	docker run --rm --name filebrowser-playwright-tests filebrowser-playwright-tests