Merged
36 commits
f41747d
Pull new dcm package file and zip checksum #4949
Nov 1, 2018
c8f80f0
Pull checksum from dcm uploaded .sha #5053
Nov 2, 2018
4b55eba
Revert "Pull checksum from dcm uploaded .sha #5053"
Nov 2, 2018
cbea0c6
Pull checksum from dcm uploaded .sha #4949
Nov 2, 2018
9284ba9
Move file info to new fragment #4949
Nov 7, 2018
0cdc100
Generate s3 identifier on import #4949
Nov 8, 2018
f391e63
Popup with s3 package download url #4949
Nov 15, 2018
cc29b93
Merge branch 'develop' into 4949-download-package-s3
Nov 15, 2018
36b8648
Launch s3Package popup from guestbook #4949
Nov 15, 2018
e7d7dcd
dcm s3 file page start #4949
Nov 19, 2018
756de2b
Share file entry code from replace and s3 #4949
Nov 28, 2018
23196c0
dcm s3 download code cleanup #4949
Nov 29, 2018
8827430
Cleaned up UI of new popup for S3 download [ref #4949]
mheppler Nov 29, 2018
1802ef0
Added placeholder User Guide section for S3 download, and link from p…
mheppler Nov 30, 2018
51069be
Fixed header formatting issue in User Guide for S3 download [ref #4949]
mheppler Nov 30, 2018
5da9584
dcms3 download friendly size display #4949
Nov 30, 2018
75fabe9
Merge branch '4949-download-package-s3' of https://github.com/IQSS/da…
Nov 30, 2018
dbc13d5
Present correct script name dcms3 #4949
Nov 30, 2018
02a6622
dataset w. package not look for rsync script #4949
Nov 30, 2018
930a77f
Merge branch 'develop' into 4949-download-package-s3
Nov 30, 2018
42e5977
code cleanup and better bundle naming #4949
Nov 30, 2018
ddf52d1
Package Download User Guide Section [#4949]
dlmurphy Nov 30, 2018
1a53227
Update to docs - Wget isn't magic [#4949]
dlmurphy Nov 30, 2018
1db38db
dcms3 download only with downloadMethod #4949
Nov 30, 2018
a6d4f96
remove s3 naming from package code #4949
Nov 30, 2018
1938862
move&format dcm s3 docs
Dec 4, 2018
f109aaf
Switch file info frag name #4949
Dec 5, 2018
5e0a0fb
code cleanup and test fixing #4949
Dec 5, 2018
ea76617
More doc cleanup #4949
Dec 5, 2018
2f12cd8
Merge branch 'develop' into 4949-download-package-s3
Dec 5, 2018
9a47958
Bump DCM version #4949
Dec 5, 2018
a170d78
Minor doc fix #4949
Dec 5, 2018
c891407
Another minor doc change #4949
Dec 5, 2018
4432cae
Fix refactor double extension #4949
matthew-a-dunlap Dec 10, 2018
5537175
dcms3 code review docfixes #4949
Dec 13, 2018
bf31f9e
dcms3 install fixes and enable restrict #4949
Dec 13, 2018
2 changes: 1 addition & 1 deletion conf/docker-aio/c7.dockerfile
@@ -3,7 +3,7 @@ FROM centos:7
RUN yum install -y https://download.postgresql.org/pub/repos/yum/9.6/redhat/rhel-7-x86_64/pgdg-centos96-9.6-3.noarch.rpm
#RUN yum install -y java-1.8.0-openjdk-headless postgresql-server sudo epel-release unzip perl curl httpd
RUN yum install -y java-1.8.0-openjdk-devel postgresql96-server sudo epel-release unzip perl curl httpd
RUN yum install -y jq lsof
RUN yum install -y jq lsof awscli

# copy and unpack dependencies (solr, glassfish)
COPY dv /tmp/dv
3 changes: 1 addition & 2 deletions conf/docker-dcm/0prep.sh
@@ -1,4 +1,3 @@
#!/bin/sh


wget https://github.com/sbgrid/data-capture-module/releases/download/0.3/dcm-0.3-0.noarch.rpm
wget https://github.com/sbgrid/data-capture-module/releases/download/0.5/dcm-0.5-0.noarch.rpm
3 changes: 2 additions & 1 deletion conf/docker-dcm/dcmsrv.dockerfile
@@ -1,13 +1,14 @@
# build from repo root
FROM centos:6
RUN yum install -y epel-release
ARG RPMFILE=dcm-0.3-0.noarch.rpm
ARG RPMFILE=dcm-0.5-0.noarch.rpm
COPY ${RPMFILE} /tmp/
COPY bashrc /root/.bashrc
COPY test_install.sh /root/
RUN yum localinstall -y /tmp/${RPMFILE}
RUN pip install -r /opt/dcm/requirements.txt
RUN pip install awscli==1.15.75
run export PATH=~/.local/bin:$PATH
RUN /root/test_install.sh
COPY rq-init-d /etc/init.d/rq
RUN useradd glassfish
128 changes: 121 additions & 7 deletions doc/sphinx-guides/source/developers/big-data-support.rst
@@ -18,9 +18,7 @@ Install a DCM

Installation instructions can be found at https://github.com/sbgrid/data-capture-module . Note that a shared filesystem (posix or AWS S3) between Dataverse and your DCM is required. You cannot use a DCM with Swift at this point in time.

Please note that S3 support for DCM is highly experimental. Files can be uploaded to S3 but they cannot be downloaded until https://github.com/IQSS/dataverse/issues/4949 is worked on. If you want to play around with S3 support for DCM, you must configure a JVM option called ``dataverse.files.dcm-s3-bucket-name`` which is a holding area for uploaded files that have not yet passed checksum validation. Search for that JVM option at https://github.com/IQSS/dataverse/issues/4703 for commands on setting that JVM option and related setup. Note that because that GitHub issue has so many comments you will need to click "Load more" where it says "hidden items". FIXME: Document all of this properly.

. FIXME: Explain what ``dataverse.files.dcm-s3-bucket-name`` is for and what it has to do with ``dataverse.files.s3-bucket-name``.
.. FIXME: Explain what ``dataverse.files.dcm-s3-bucket-name`` is for and what it has to do with ``dataverse.files.s3-bucket-name``.

Once you have installed a DCM, you will need to configure two database settings on the Dataverse side. These settings are documented in the :doc:`/installation/config` section of the Installation Guide:

@@ -61,7 +59,6 @@ Steps to set up a DCM mock for Development

Install Flask.


Download and run the mock. You will be cloning the https://github.com/sbgrid/data-capture-module repo.

- ``git clone git://github.com/sbgrid/data-capture-module.git``
@@ -108,6 +105,123 @@ The following low level command should only be used when troubleshooting the "im

``curl -H "X-Dataverse-key: $API_TOKEN" -X POST "$DV_BASE_URL/api/batch/jobs/import/datasets/files/$DATASET_DB_ID?uploadFolder=$UPLOAD_FOLDER&totalSize=$TOTAL_SIZE"``

Steps to set up a DCM via Docker for Development
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If you need a fully operating DCM client for development purposes, these steps will guide you through setting one up, including the S3 variant of the DCM.

Docker Image Set-up
^^^^^^^^^^^^^^^^^^^

- Install Docker if you do not have it
- Follow these steps (extracted from ``docker-aio/readme.md`` & ``docker-dcm/readme.txt``):

- ``cd conf/docker-aio`` and run ``./0prep_deps.sh`` to create Glassfish and Solr tarballs in conf/docker-aio/dv/deps.
- Run ``./1prep.sh``
- Build the docker image: ``docker build -t dv0 -f c7.dockerfile .``
- ``cd ../docker-dcm`` and run ``./0prep.sh``
- Build dcm/dv0dcm images with docker-compose: ``docker-compose -f docker-compose.yml build``
- Start containers: ``docker-compose -f docker-compose.yml up -d``
- Wait for the container to show "healthy" (via ``docker ps``), then wait another 5 minutes (even though it reports healthy, Glassfish is still starting up). Then run the Dataverse application installation: ``docker exec -it dvsrv /opt/dv/install.bash``
- Configure Dataverse application to use DCM (run from outside the container): ``docker exec -it dvsrv /opt/dv/configure_dcm.sh``
- The Dataverse installation is accessible at ``http://localhost:8084``.
- You may need to change the DoiProvider inside dvsrv (EZID does not work):

- ``curl -X DELETE -d EZID "localhost:8080/api/admin/settings/:DoiProvider"``
- ``curl -X PUT -d DataCite "localhost:8080/api/admin/settings/:DoiProvider"``
- Also change the ``doi.baseUrlString``, ``doi.username``, and ``doi.password`` settings
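The steps above can be condensed into a single shell sketch. This assumes Docker and docker-compose are installed and that you start from the repository root; paths, image names, and scripts are taken from the instructions above, not independently verified:

```shell
# Build and start the Dataverse + DCM development containers.
cd conf/docker-aio
./0prep_deps.sh                                   # create Glassfish and Solr tarballs in dv/deps
./1prep.sh
docker build -t dv0 -f c7.dockerfile .            # build the Dataverse base image
cd ../docker-dcm
./0prep.sh                                        # fetch the DCM rpm
docker-compose -f docker-compose.yml build        # build the dcm/dv0dcm images
docker-compose -f docker-compose.yml up -d        # start the containers
docker ps                                         # wait for "healthy", then ~5 more minutes
docker exec -it dvsrv /opt/dv/install.bash        # install the Dataverse app
docker exec -it dvsrv /opt/dv/configure_dcm.sh    # configure Dataverse to use the DCM
```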

Optional steps for setting up the S3 Docker DCM Variant
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

- Before starting: the default bucket the DCM uses to hold files in S3 is named ``test-dcm``. It is hard-coded in ``post_upload_s3.bash`` (line 30); change it to a different bucket if needed.
- Add AWS bucket info to dcmsrv

- You need a credentials file in ``~/.aws``

- ``mkdir ~/.aws``
- ``yum install nano`` (or use a different editor below)
- ``nano ~/.aws/credentials`` and add these contents with your keys:

- ``[default]``
- ``aws_access_key_id =``
- ``aws_secret_access_key =``

- Dataverse configuration (on dvsrv):

- Set S3 as the storage driver

- ``cd /opt/glassfish4/bin/``
- ``./asadmin delete-jvm-options "\-Ddataverse.files.storage-driver-id=file"``
- ``./asadmin create-jvm-options "\-Ddataverse.files.storage-driver-id=s3"``

- Add AWS bucket info to Dataverse

- ``mkdir ~/.aws``
- ``yum install nano`` (or use a different editor below)
- ``nano ~/.aws/credentials`` and add these contents with your keys:

- ``[default]``
- ``aws_access_key_id =``
- ``aws_secret_access_key =``

- Also: ``nano ~/.aws/config`` to create a region file. Add these contents:

- ``[default]``
- ``region = us-east-1``

- Add the S3 bucket names to Dataverse

- S3 bucket for Dataverse

- ``/usr/local/glassfish4/glassfish/bin/asadmin create-jvm-options "-Ddataverse.files.s3-bucket-name=iqsstestdcmbucket"``

- S3 bucket for DCM (as Dataverse needs to do the copy over)

- ``/usr/local/glassfish4/glassfish/bin/asadmin create-jvm-options "-Ddataverse.files.dcm-s3-bucket-name=test-dcm"``

- Set the download method to HTTP, as DCM downloads through S3 use this protocol: ``curl -X PUT "http://localhost:8080/api/admin/settings/:DownloadMethods" -d "native/http"``
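As a sketch, the ``~/.aws`` files described above can be created non-interactively instead of with an editor. The key values are placeholders that you must replace with your own credentials:

```shell
# Create the AWS credentials and region config files without an editor.
# YOUR_ACCESS_KEY_ID / YOUR_SECRET_ACCESS_KEY are placeholders, not real keys.
mkdir -p ~/.aws
cat > ~/.aws/credentials <<'EOF'
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
EOF
cat > ~/.aws/config <<'EOF'
[default]
region = us-east-1
EOF
```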

Using the DCM Docker Containers
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To use these commands, you will need to connect to a shell prompt inside the various containers (e.g. ``docker exec -it dvsrv /bin/bash``).

- Create a dataset and download the rsync upload script
- Copy the upload script to dcm_client (if preferred, you can perform all of the create/download actions inside dcm_client)

- ``docker cp ~/Downloads/upload-FK2_NN49YM.bash dcm_client:/tmp``

- Create a folder of files to upload (files can be empty)
- Run script

- e.g. ``bash ./upload-FK2_NN49YM.bash``

- Manually run the post-upload script on dcmsrv

- for posix implementation: ``bash ./opt/dcm/scn/post_upload.bash``
- for S3 implementation: ``bash ./opt/dcm/scn/post_upload_s3.bash``

Additional DCM docker development tips
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

- You can completely blow away all the docker images with these commands (including non-DCM ones!)

- ``docker stop dvsrv``
- ``docker stop dcm_client``
- ``docker stop dcmsrv``
- ``docker rm $(docker ps -a -q)``
- ``docker rmi $(docker images -q)``

- There are a few logs to tail

- dvsrv : ``tail -n 2000 -f /opt/glassfish4/glassfish/domains/domain1/logs/server.log``
- dcmsrv : ``tail -n 2000 -f /var/log/lighttpd/breakage.log``
- dcmsrv : ``tail -n 2000 -f /var/log/lighttpd/access.log``

- Note that by default a docker container stops running when the process it follows exits (for example, Flask in dcmsrv). You can work around this by having the followed script never exit (e.g. ``sleep infinity``); see https://stackoverflow.com/questions/31870222/how-can-i-keep-container-running-on-kubernetes
- You may have to restart the Glassfish domain occasionally to deal with memory filling up. If deployment is getting really slow, it's a good time to restart.

Repository Storage Abstraction Layer (RSAL)
-------------------------------------------

@@ -221,7 +335,7 @@ Available Steps
Dataverse has an internal step provider, whose id is ``:internal``. It offers the following steps:

log
+++
^^^

A step that writes data about the current workflow invocation to the instance log. It also writes the messages in its ``parameters`` map.

@@ -238,7 +352,7 @@ A step that writes data about the current workflow invocation to the instance lo


pause
+++++
^^^^^

A step that pauses the workflow. The workflow is paused until a POST request is sent to ``/api/workflows/{invocation-id}``.

@@ -251,7 +365,7 @@ A step that pauses the workflow. The workflow is paused until a POST request is


http/sr
+++++++
^^^^^^^

A step that sends an HTTP request to an external system, and then waits for a response. The response has to match a regular expression specified in the step parameters. The URL, content type, and message body can use data from the workflow context, using a simple markup language. This step has specific parameters for rollback.

13 changes: 10 additions & 3 deletions doc/sphinx-guides/source/user/find-use-data.rst
@@ -17,7 +17,7 @@ Basic Search
You can search the entire contents of the Dataverse installation, including dataverses, datasets, and files. You can access the search through the search bar on the homepage, or by clicking the magnifying glass icon in the header of every page. The search bar accepts search terms, queries, or exact phrases (in quotations).

Sorting and Viewing Search Results
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Facets: to the left of the search results, there are several facets a user can click on to narrow the number of results displayed.
- Choosing a facet: to choose a facet to narrow your results by, click on that facet.
@@ -81,7 +81,7 @@ You may also download a file from its file page by clicking the Download button
Tabular data files offer additional options: You can explore using the TwoRavens data visualization tool (or other :doc:`/installation/external-tools` if they have been enabled) by clicking the Explore button, or choose from a number of tabular-data-specific download options available as a dropdown under the Download button.

Tabular Data
~~~~~~~~~~~~
^^^^^^^^^^^^

Ingested files can be downloaded in several different ways.

@@ -99,7 +99,7 @@ Ingested files can be downloaded in several different ways.
.. _rsync_download:

Downloading a Dataverse Package via rsync
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

rsync is typically used for synchronizing files and directories between two different systems. Some Dataverse installations allow downloads using rsync, to facilitate large file transfers in a reliable and secure manner.

@@ -110,6 +110,13 @@ In a dataset containing a Dataverse Package, at the bottom of the dataset page,
After you've downloaded the Dataverse Package, you may want to double-check that your download went perfectly. Under **Verify Data**, you'll find a command that you can run in your terminal that will initiate a checksum to ensure that the data you downloaded matches the data in Dataverse precisely. This way, you can ensure the integrity of the data you're working with.


.. _package_download_url:

Downloading a Dataverse Package via URL
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Dataverse Packages are typically used to represent extremely large files or bundles containing a large number of files. Dataverse Packages are often too large to be reliably downloaded using a web browser. When you click to download a Dataverse Package, instead of automatically initiating the download in your web browser, Dataverse displays a plaintext URL for the location of the file. To ensure a reliable, resumable download, we recommend using `GNU Wget <https://www.gnu.org/software/wget/>`_ in a command line terminal or a download manager of your choice. If you simply paste the URL into your web browser, the download may overwhelm your browser, resulting in an interrupted, timed out, or otherwise failed download.
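As an illustration, a download-and-verify session might look like the sketch below. The URL and file names are placeholders, and the verification step fabricates a stand-in package and checksum file purely to show the mechanics; use the actual download URL and the Verify Data command shown for your dataset.

```shell
# Resumable download with Wget (-c resumes a partial download).
# The URL below is a placeholder, so the command is shown commented out:
#   wget -c -O package.zip "https://demo.dataverse.example/api/access/datafile/1234"

# Verifying integrity against a published checksum file. Here we create a
# stand-in package and checksum so the commands can run end to end:
echo "example package contents" > package.zip
sha256sum package.zip > package.zip.sha256
sha256sum -c package.zip.sha256   # prints "package.zip: OK" when the file is intact
```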

Explore Data
------------

7 changes: 6 additions & 1 deletion src/main/java/Bundle.properties
@@ -1622,6 +1622,7 @@ file.downloadDialog.header=Dataset Terms
file.downloadDialog.tip=Please confirm and/or complete the information needed below in order to continue.
file.requestAccessTermsDialog.tip=Please confirm and/or complete the information needed below in order to request access to files in this dataset.
file.requestAccess.notAllowed=Requests for access are not accepted on the Dataset.

file.search.placeholder=Search this dataset...
file.results.btn.sort=Sort
file.results.btn.sort.option.nameAZ=Name (A-Z)
@@ -2145,6 +2146,10 @@ permission.ViewUnpublishedDataset.desc=View an unpublished dataset and its files
permission.ViewUnpublishedDataverse.desc=View an unpublished dataverse
permission.AddDataset.desc=Add a dataset to a dataverse

packageDownload.title=Package File Download
packageDownload.instructions=Use the Download URL in a Wget command or a download manager to download this package file. Download via web browser is not recommended. <a href="{0}/{1}/user/find-use-data.html#downloading-a-dataverse-package-via-url" title="User Guide - Downloading a Dataverse Package via URL" target="_blank">User Guide - Downloading a Dataverse Package via URL</a>
packageDownload.urlHeader=Download URL

#mydata_fragment.xhtml
Published=Published
Unpublished=Unpublished
@@ -2243,4 +2248,4 @@ rtabfileparser.ioexception.mismatch=Reading mismatch, line {0} of the Data file:
rtabfileparser.ioexception.boolean=Unexpected value for the Boolean variable ({0}):
rtabfileparser.ioexception.read=Couldn't read Boolean variable ({0})!
rtabfileparser.ioexception.parser1=R Tab File Parser: Could not obtain varQnty from the dataset metadata.
rtabfileparser.ioexception.parser2=R Tab File Parser: varQnty=0 in the dataset metadata!
rtabfileparser.ioexception.parser2=R Tab File Parser: varQnty=0 in the dataset metadata!
5 changes: 3 additions & 2 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -2,6 +2,7 @@

import com.amazonaws.services.lightsail.model.Bundle;
import edu.harvard.iq.dataverse.provenance.ProvPopupFragmentBean;
import edu.harvard.iq.dataverse.PackagePopupFragmentBean;
import edu.harvard.iq.dataverse.api.AbstractApiBean;
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
@@ -198,7 +199,6 @@ public enum DisplayMode {
@Inject
ProvPopupFragmentBean provPopupFragmentBean;


private Dataset dataset = new Dataset();
private EditMode editMode;
private boolean bulkFileDeleteInProgress = false;
@@ -1469,7 +1469,8 @@ private String init(boolean initFull) {
this.guestbookResponse = guestbookResponseService.initGuestbookResponseForFragment(workingVersion, null, session);
this.getFileDownloadHelper().setGuestbookResponse(guestbookResponse);
logger.fine("Checking if rsync support is enabled.");
if (DataCaptureModuleUtil.rsyncSupportEnabled(settingsWrapper.getValueForKey(SettingsServiceBean.Key.UploadMethods))) {
if (DataCaptureModuleUtil.rsyncSupportEnabled(settingsWrapper.getValueForKey(SettingsServiceBean.Key.UploadMethods))
&& dataset.getFiles().isEmpty()) { //only check for rsync if no files exist
try {
ScriptRequestResponse scriptRequestResponse = commandEngine.submit(new RequestRsyncScriptCommand(dvRequestService.getDataverseRequest(), dataset));
logger.fine("script: " + scriptRequestResponse.getScript());
@@ -1728,7 +1728,8 @@ public void setHasRsyncScript(Boolean hasRsyncScript) {

private void setUpRsync() {
logger.fine("setUpRsync called...");
if (DataCaptureModuleUtil.rsyncSupportEnabled(settingsWrapper.getValueForKey(SettingsServiceBean.Key.UploadMethods))) {
if (DataCaptureModuleUtil.rsyncSupportEnabled(settingsWrapper.getValueForKey(SettingsServiceBean.Key.UploadMethods))
&& dataset.getFiles().isEmpty()) { //only check for rsync if no files exist
try {
ScriptRequestResponse scriptRequestResponse = commandEngine.submit(new RequestRsyncScriptCommand(dvRequestService.getDataverseRequest(), dataset));
logger.fine("script: " + scriptRequestResponse.getScript());
21 changes: 17 additions & 4 deletions src/main/java/edu/harvard/iq/dataverse/FileDownloadHelper.java
@@ -297,7 +297,22 @@ public void writeGuestbookAndLaunchExploreTool(GuestbookResponse guestbookRespon
}
fileDownloadService.explore(guestbookResponse, fmd, externalTool);
requestContext.execute("PF('downloadPopup').hide()");
}
}

public void writeGuestbookAndLaunchPackagePopup(GuestbookResponse guestbookResponse) {
RequestContext requestContext = RequestContext.getCurrentInstance();
boolean valid = validateGuestbookResponse(guestbookResponse);

if (!valid) {
JH.addMessage(FacesMessage.SEVERITY_ERROR, JH.localize("dataset.message.validationError"));
} else {
requestContext.execute("PF('downloadPopup').hide()");
requestContext.execute("PF('downloadPackagePopup').show()");
requestContext.execute("handleResizeDialog('downloadPackagePopup')");

fileDownloadService.writeGuestbookResponseRecord(guestbookResponse);
}
}

public String startWorldMapDownloadLink(GuestbookResponse guestbookResponse, FileMetadata fmd){

@@ -336,10 +351,8 @@ public void clearRequestAccessFiles(){
public void addMultipleFilesForRequestAccess(DataFile dataFile) {
this.filesForRequestAccess.add(dataFile);

}
}



private String selectedFileId = null;

public String getSelectedFileId() {
@@ -145,14 +145,19 @@ public void writeGuestbookAndStartFileDownload(GuestbookResponse guestbookRespon
logger.fine("issued file download redirect for datafile "+guestbookResponse.getDataFile().getId());
}

public void writeGuestbookResponseRecord(GuestbookResponse guestbookResponse, FileMetadata fileMetadata, String format) {
if(!fileMetadata.getDatasetVersion().isDraft()){
guestbookResponse = guestbookResponseService.modifyDatafileAndFormat(guestbookResponse, fileMetadata, format);
writeGuestbookResponseRecord(guestbookResponse);
}
}

public void writeGuestbookResponseRecord(GuestbookResponse guestbookResponse) {

try {
CreateGuestbookResponseCommand cmd = new CreateGuestbookResponseCommand(dvRequestService.getDataverseRequest(), guestbookResponse, guestbookResponse.getDataset());
commandEngine.submit(cmd);
} catch (CommandException e) {
// if an error occurs here, the download won't happen, so there is no need for a response record

}
}

@@ -0,0 +1,32 @@
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/

package edu.harvard.iq.dataverse;

import javax.faces.view.ViewScoped;
import javax.inject.Named;

/**
*
* @author matthew
*/

@ViewScoped
@Named
public class PackagePopupFragmentBean implements java.io.Serializable {

FileMetadata fm;

public void setFileMetadata(FileMetadata fileMetadata) {
fm = fileMetadata;
}

public FileMetadata getFileMetadata() {
return fm;
}

}
